author     Dan Schult <dschult@colgate.edu>   2021-05-16 11:10:45 -0400
committer  GitHub <noreply@github.com>        2021-05-16 11:10:45 -0400
commit     132faa6a20b309a38b23e6911ef99eaaea707252 (patch)
tree       99c3ee06e2fd953de01902ee0c265d422e95ab7c
parent     527264cee91d6a2e40d863044f10fc8ddc189e6a (diff)
download   networkx-132faa6a20b309a38b23e6911ef99eaaea707252.tar.gz
Add approximation algorithms for traveling salesman problem (#4607)
* Add greedy algorithm for solving TSP

  Many problems in Combinatorial Optimization can be represented as graphs.
  These problems have enormous significance in many areas of science, but for
  some of them no polynomial-time algorithms are known. However, many heuristic
  and metaheuristic algorithms have been published over the years to solve or
  approximate solutions to these problems. The purpose of this commit is to add
  implementations of such algorithms for one of the most famous problems of
  Combinatorial Optimization, the Travelling Salesman Problem (TSP). A greedy
  algorithm is implemented first. An "applications" package has been created
  whose modules each represent a problem, and each module contains several
  algorithms for solving that specific problem. In this commit the tsp.py
  module is added, which contains the greedy_tsp() function, an implementation
  of a greedy algorithm.
* Fix example error
* Trivial changes: remove the unnecessary _is_weighted() function and improve
  the documentation
* Add applications package to setup.py file
* Change output of greedy algorithm: the algorithm's output is now a list of nodes
* Add simulated annealing algorithm: a metaheuristic local search algorithm
  for solving TSP
* Minor changes
* Fix example doc errors
* Compatible with python 3
* Move tsp module to algorithms package
* Code improvements
* Handle small graphs and fix doc examples
* Documentation changes and rename variables
* Add Threshold Accepting algorithm for TSP
* Implement maximal matching of minimal weight and create its test suite
* Remove useless print
* Implement Christofides
* Coding was missing
* Add more general traveling_salesman_problem using christofides; also
  reconfigure the import structure and remove min_weight_matching from the
  module since it is now in matching.py
* Add new functions to the docs and fix minor typos
* pep8 fixes
* fix pep8 and change .gitignore
* Add tests of the approximation namespace; update docs in
  approximation/__init__.py
* Fix is_matching to check that edges are in G. Other tweaks: doc changes and
  put not_implemented_for on find_matching functions
* Improve is_matching selfloop handling and expand tests
* Move tsp to the approximation directory. Apply black.
* Move tsp tests to the approximation tests folder
* Attempt to bring tsp up to current code
* Commit pep8 fixes that black did not change but pep8speaks did find
* Tweak a few things and run black
* Combine #4083 and #3585 into traveling_salesman.py
* Match christofides output to the other tsp functions, adjust the calling
  syntax of the tests, and tweak the docs and See Also sections
* Put big-O complexity in an in-line math environment. Prevents sphinx from
  trying to do variable substitution between pipes.
* Minor touchups to christofides docstring
* RST touchups to tsp module docstring
* Remove extra string from tsp module
* Docstring touchups for traveling_salesman_problem
* rst fixups for greedy_tsp docstring
* rst formatting for simulated annealing docstring
* More math in-lining for simulated annealing docstring
* rst and minor grammatical fixes to TA docstring
* Fix path-finding and test all methods for the tsp function
* The refactoring was incomplete. Now maybe it is:
  - Add tests of TSP with all methods.
  - Refactor tests to match the simulated_annealing and threshold tests.
  - Unify treatment of weight so unweighted edges use default weight 1.
    weight now defaults to "weight" with a default value of 1.
  - Rename tolerance to max_iterations (tolerance is used for error bound).
  - Rename iterations to N_inner (each iteration takes this many inner loops).
  - Introduce idioms like `pairwise` and `cycle.copy()` (over cycle[:]).
  - Allow passthrough of method kwargs for traveling_salesman_problem.

  Still need to:
  - add a test of the case where the path is more than one edge shorter than
    the cycle (incomplete_graph)
  - require cycle input (maybe make the default list(G)??)
  - consider the complexity claims in the docstrings
* More API changes to TSP functions:
  - `christofides` now allows (and ignores) selfloops
  - `move` can be a function as well as "1-1" and "1-0"
  - `method` for traveling_salesman_problem must have 2 arguments instead of
    passing kwargs. The user must "curry" to set parameters.
  - fixed docstring typos in matching.py
* Add test to check that cycle=False can remove many edges
* Change the init_cycle API to require input from the user. The idea is to make
  the user specify the initial cycle to start from rather than relying on the
  programmer's default of a greedy algorithm. To ease usage, the string
  "greedy" is accepted as a shortcut.
* Update docs with more correct complexity info
* Check for complete graph is now more efficient and selfloops are ignored
* Merge is_matching changes

Co-authored-by: Thodoris Sotiropoulos <theosotr@windowslive.com>
Co-authored-by: Luca Cappelletti <cappelletti.luca94@gmail.com>
Co-authored-by: Ross Barnowski <rossbar@berkeley.edu>
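As a quick orientation for reviewers, here is a small usage sketch assembled from the doctest examples added in this commit (the four-node digraph, the curried `method` lambda, and the weighted nine-node cycle all come from the docstrings further down; exact output can vary with the seed)::

    import networkx as nx
    import networkx.algorithms.approximation as nx_app

    # Complete weighted digraph borrowed from the doctest examples below.
    G = nx.DiGraph()
    G.add_weighted_edges_from({
        ("A", "B", 3), ("A", "C", 17), ("A", "D", 14), ("B", "A", 3),
        ("B", "C", 12), ("B", "D", 16), ("C", "A", 13), ("C", "B", 12),
        ("C", "D", 4), ("D", "A", 14), ("D", "B", 15), ("D", "C", 2),
    })

    # Build a first cycle greedily, then refine it with simulated annealing.
    init = nx_app.greedy_tsp(G, source="D")
    cycle = nx_app.simulated_annealing_tsp(G, init, source="D", seed=42)
    cost = sum(G[u][v]["weight"] for u, v in nx.utils.pairwise(cycle))
    print(cycle, cost)  # per the docstrings: ['D', 'C', 'B', 'A', 'D'] with cost 31

    # traveling_salesman_problem handles incomplete graphs by first building
    # the complete graph of all-pairs shortest paths.  Extra parameters of the
    # inner method are bound ("curried") into a two-argument callable.
    H = nx.cycle_graph(9)
    H[4][5]["weight"] = 5  # all other edges keep the default weight of 1
    SA_tsp = nx_app.simulated_annealing_tsp
    method = lambda G, wt: SA_tsp(G, "greedy", weight=wt, temp=500)
    path = nx_app.traveling_salesman_problem(H, cycle=False, method=method)
    print(path)  # per the doctest: [4, 3, 2, 1, 0, 8, 7, 6, 5] or its reverse

    # The commit also adds min_weight_matching, which christofides uses internally.
    matching = nx.min_weight_matching(nx.complete_graph(4))

The two-argument lambda reflects the API decision noted above: `method` receives only the graph and the weight key, so any other parameter (such as `temp` here) has to be curried in beforehand.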
-rw-r--r--   doc/reference/algorithms/approximation.rst                          |  13
-rw-r--r--   doc/reference/algorithms/matching.rst                               |   1
-rw-r--r--   networkx/algorithms/__init__.py                                     |   1
-rw-r--r--   networkx/algorithms/approximation/__init__.py                       |  17
-rw-r--r--   networkx/algorithms/approximation/maxcut.py                         |   6
-rw-r--r--   networkx/algorithms/approximation/tests/test_traveling_salesman.py  | 399
-rw-r--r--   networkx/algorithms/approximation/traveling_salesman.py             | 846
-rw-r--r--   networkx/algorithms/matching.py                                     |  57
-rw-r--r--   networkx/algorithms/tests/test_matching.py                          | 183
-rw-r--r--   networkx/classes/graph.py                                           |   4
-rw-r--r--   networkx/tests/test_all_random_functions.py                         |  14
11 files changed, 1442 insertions, 99 deletions
diff --git a/doc/reference/algorithms/approximation.rst b/doc/reference/algorithms/approximation.rst
index 5062a528..ca097deb 100644
--- a/doc/reference/algorithms/approximation.rst
+++ b/doc/reference/algorithms/approximation.rst
@@ -91,6 +91,19 @@ Steiner Tree
steiner_tree
+Traveling Salesman
+------------------
+.. automodule:: networkx.algorithms.approximation.traveling_salesman
+.. autosummary::
+ :toctree: generated/
+
+ christofides
+ traveling_salesman_problem
+ greedy_tsp
+ simulated_annealing_tsp
+ threshold_accepting_tsp
+
+
Treewidth
---------
.. automodule:: networkx.algorithms.approximation.treewidth
diff --git a/doc/reference/algorithms/matching.rst b/doc/reference/algorithms/matching.rst
index 331e2b50..194893f7 100644
--- a/doc/reference/algorithms/matching.rst
+++ b/doc/reference/algorithms/matching.rst
@@ -11,3 +11,4 @@ Matching
is_perfect_matching
maximal_matching
max_weight_matching
+ min_weight_matching
diff --git a/networkx/algorithms/__init__.py b/networkx/algorithms/__init__.py
index 3226b0f7..f49e812c 100644
--- a/networkx/algorithms/__init__.py
+++ b/networkx/algorithms/__init__.py
@@ -58,6 +58,7 @@ from networkx.algorithms.wiener import *
# Make certain subpackages available to the user as direct imports from
# the `networkx` namespace.
+import networkx.algorithms.approximation
import networkx.algorithms.assortativity
import networkx.algorithms.bipartite
import networkx.algorithms.node_classification
diff --git a/networkx/algorithms/approximation/__init__.py b/networkx/algorithms/approximation/__init__.py
index 68ef44c2..13fc21f9 100644
--- a/networkx/algorithms/approximation/__init__.py
+++ b/networkx/algorithms/approximation/__init__.py
@@ -1,14 +1,14 @@
-"""Approximations of graph properties and Heuristic functions for optimization
-problems.
+"""Approximations of graph properties and Heuristic methods for optimization.
- .. warning:: The approximation submodule is not imported in the top-level
- ``networkx``.
+ .. warning:: These functions are not imported in the top-level of ``networkx``
- These functions can be imported with
- ``from networkx.algorithms import approximation``.
+ These functions can be accessed using
+ ``networkx.approximation.function_name``
-"""
+ They can be imported using ``from networkx.algorithms import approximation``
+ or ``from networkx.algorithms.approximation import function_name``
+"""
from networkx.algorithms.approximation.clustering_coefficient import *
from networkx.algorithms.approximation.clique import *
from networkx.algorithms.approximation.connectivity import *
@@ -18,6 +18,7 @@ from networkx.algorithms.approximation.kcomponents import *
from networkx.algorithms.approximation.matching import *
from networkx.algorithms.approximation.ramsey import *
from networkx.algorithms.approximation.steinertree import *
-from networkx.algorithms.approximation.vertex_cover import *
+from networkx.algorithms.approximation.traveling_salesman import *
from networkx.algorithms.approximation.treewidth import *
+from networkx.algorithms.approximation.vertex_cover import *
from networkx.algorithms.approximation.maxcut import *
diff --git a/networkx/algorithms/approximation/maxcut.py b/networkx/algorithms/approximation/maxcut.py
index 47e5c70c..d2b5eced 100644
--- a/networkx/algorithms/approximation/maxcut.py
+++ b/networkx/algorithms/approximation/maxcut.py
@@ -1,10 +1,10 @@
import networkx as nx
-from networkx.utils.decorators import py_random_state
+from networkx.utils.decorators import py_random_state, not_implemented_for
__all__ = ["randomized_partitioning", "one_exchange"]
-@nx.not_implemented_for("directed", "multigraph")
+@not_implemented_for("directed", "multigraph")
@py_random_state(1)
def randomized_partitioning(G, seed=None, p=0.5, weight=None):
"""Compute a random partitioning of the graph nodes and its cut value.
@@ -48,7 +48,7 @@ def _swap_node_partition(cut, node):
return cut - {node} if node in cut else cut.union({node})
-@nx.not_implemented_for("directed", "multigraph")
+@not_implemented_for("directed", "multigraph")
@py_random_state(2)
def one_exchange(G, initial_cut=None, seed=None, weight=None):
"""Compute a partitioning of the graphs nodes and the corresponding cut value.
diff --git a/networkx/algorithms/approximation/tests/test_traveling_salesman.py b/networkx/algorithms/approximation/tests/test_traveling_salesman.py
new file mode 100644
index 00000000..78d93351
--- /dev/null
+++ b/networkx/algorithms/approximation/tests/test_traveling_salesman.py
@@ -0,0 +1,399 @@
+"""Unit tests for the traveling_salesman module."""
+import pytest
+import random
+import networkx as nx
+import networkx.algorithms.approximation as nx_app
+
+pairwise = nx.utils.pairwise
+
+
+def test_christofides_hamiltonian():
+ random.seed(42)
+ G = nx.complete_graph(20)
+ for (u, v) in G.edges():
+ G[u][v]["weight"] = random.randint(0, 10)
+
+ H = nx.Graph()
+ H.add_edges_from(pairwise(nx_app.christofides(G)))
+ H.remove_edges_from(nx.find_cycle(H))
+ assert len(H.edges) == 0
+
+ tree = nx.minimum_spanning_tree(G, weight="weight")
+ H = nx.Graph()
+ H.add_edges_from(pairwise(nx_app.christofides(G, tree)))
+ H.remove_edges_from(nx.find_cycle(H))
+ assert len(H.edges) == 0
+
+
+def test_christofides_incomplete_graph():
+ G = nx.complete_graph(10)
+ G.remove_edge(0, 1)
+ pytest.raises(nx.NetworkXError, nx_app.christofides, G)
+
+
+def test_christofides_ignore_selfloops():
+ G = nx.complete_graph(5)
+ G.add_edge(3, 3)
+ cycle = nx_app.christofides(G)
+ assert len(cycle) - 1 == len(G) == len(set(cycle))
+
+
+# set up graphs for other tests
+@classmethod
+def _setup_class(cls):
+ cls.DG = nx.DiGraph()
+ cls.DG.add_weighted_edges_from(
+ {
+ ("A", "B", 3),
+ ("A", "C", 17),
+ ("A", "D", 14),
+ ("B", "A", 3),
+ ("B", "C", 12),
+ ("B", "D", 16),
+ ("C", "A", 13),
+ ("C", "B", 12),
+ ("C", "D", 4),
+ ("D", "A", 14),
+ ("D", "B", 15),
+ ("D", "C", 2),
+ }
+ )
+ cls.DG_cycle = ["D", "C", "B", "A", "D"]
+ cls.DG_cost = 31.0
+
+ cls.DG2 = nx.DiGraph()
+ cls.DG2.add_weighted_edges_from(
+ {
+ ("A", "B", 3),
+ ("A", "C", 17),
+ ("A", "D", 14),
+ ("B", "A", 30),
+ ("B", "C", 2),
+ ("B", "D", 16),
+ ("C", "A", 33),
+ ("C", "B", 32),
+ ("C", "D", 34),
+ ("D", "A", 14),
+ ("D", "B", 15),
+ ("D", "C", 2),
+ }
+ )
+ cls.DG2_cycle = ["D", "A", "B", "C", "D"]
+ cls.DG2_cost = 53.0
+
+ cls.unweightedUG = nx.complete_graph(5, nx.Graph())
+ cls.unweightedDG = nx.complete_graph(5, nx.DiGraph())
+
+ cls.incompleteUG = nx.Graph()
+ cls.incompleteUG.add_weighted_edges_from({(0, 1, 1), (1, 2, 3)})
+ cls.incompleteDG = nx.DiGraph()
+ cls.incompleteDG.add_weighted_edges_from({(0, 1, 1), (1, 2, 3)})
+
+ cls.UG = nx.Graph()
+ cls.UG.add_weighted_edges_from(
+ {
+ ("A", "B", 3),
+ ("A", "C", 17),
+ ("A", "D", 14),
+ ("B", "C", 12),
+ ("B", "D", 16),
+ ("C", "D", 4),
+ }
+ )
+ cls.UG_cycle = ["D", "C", "B", "A", "D"]
+ cls.UG_cost = 33.0
+
+ cls.UG2 = nx.Graph()
+ cls.UG2.add_weighted_edges_from(
+ {
+ ("A", "B", 1),
+ ("A", "C", 15),
+ ("A", "D", 5),
+ ("B", "C", 16),
+ ("B", "D", 8),
+ ("C", "D", 3),
+ }
+ )
+ cls.UG2_cycle = ["D", "C", "B", "A", "D"]
+ cls.UG2_cost = 25.0
+
+
+def validate_solution(soln, cost, exp_soln, exp_cost):
+ assert soln == exp_soln
+ assert cost == exp_cost
+
+
+def validate_symmetric_solution(soln, cost, exp_soln, exp_cost):
+ assert soln == exp_soln or soln == exp_soln[::-1]
+ assert cost == exp_cost
+
+
+class TestGreedyTSP:
+ setup_class = _setup_class
+
+ def test_greedy(self):
+ cycle = nx_app.greedy_tsp(self.DG, source="D")
+ cost = sum(self.DG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, ["D", "C", "B", "A", "D"], 31.0)
+
+ cycle = nx_app.greedy_tsp(self.DG2, source="D")
+ cost = sum(self.DG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, ["D", "C", "B", "A", "D"], 78.0)
+
+ cycle = nx_app.greedy_tsp(self.UG, source="D")
+ cost = sum(self.UG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, ["D", "C", "B", "A", "D"], 33.0)
+
+ cycle = nx_app.greedy_tsp(self.UG2, source="D")
+ cost = sum(self.UG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, ["D", "C", "A", "B", "D"], 27.0)
+
+ def test_not_complete_graph(self):
+ pytest.raises(nx.NetworkXError, nx_app.greedy_tsp, self.incompleteUG)
+ pytest.raises(nx.NetworkXError, nx_app.greedy_tsp, self.incompleteDG)
+
+ def test_not_weighted_graph(self):
+ nx_app.greedy_tsp(self.unweightedUG)
+ nx_app.greedy_tsp(self.unweightedDG)
+
+ def test_two_nodes(self):
+ G = nx.Graph()
+ G.add_weighted_edges_from({(1, 2, 1)})
+ cycle = nx_app.greedy_tsp(G)
+ cost = sum(G[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, [1, 2, 1], 2)
+
+ def test_ignore_selfloops(self):
+ G = nx.complete_graph(5)
+ G.add_edge(3, 3)
+ cycle = nx_app.greedy_tsp(G)
+ assert len(cycle) - 1 == len(G) == len(set(cycle))
+
+
+class TestSimulatedAnnealingTSP:
+ setup_class = _setup_class
+ tsp = staticmethod(nx_app.simulated_annealing_tsp)
+
+ def test_simulated_annealing_directed(self):
+ cycle = self.tsp(self.DG, "greedy", source="D", seed=42)
+ cost = sum(self.DG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, self.DG_cycle, self.DG_cost)
+
+ initial_sol = ["D", "B", "A", "C", "D"]
+ cycle = self.tsp(self.DG, initial_sol, source="D", seed=42)
+ cost = sum(self.DG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, self.DG_cycle, self.DG_cost)
+
+ initial_sol = ["D", "A", "C", "B", "D"]
+ cycle = self.tsp(self.DG, initial_sol, move="1-0", source="D", seed=42)
+ cost = sum(self.DG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, self.DG_cycle, self.DG_cost)
+
+ cycle = self.tsp(self.DG2, "greedy", source="D", seed=42)
+ cost = sum(self.DG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, self.DG2_cycle, self.DG2_cost)
+
+ cycle = self.tsp(self.DG2, "greedy", move="1-0", source="D", seed=42)
+ cost = sum(self.DG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, self.DG2_cycle, self.DG2_cost)
+
+ def test_simulated_annealing_undirected(self):
+ cycle = self.tsp(self.UG, "greedy", source="D", seed=42)
+ cost = sum(self.UG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, self.UG_cycle, self.UG_cost)
+
+ cycle = self.tsp(self.UG2, "greedy", source="D", seed=42)
+ cost = sum(self.UG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_symmetric_solution(cycle, cost, self.UG2_cycle, self.UG2_cost)
+
+ cycle = self.tsp(self.UG2, "greedy", move="1-0", source="D", seed=42)
+ cost = sum(self.UG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_symmetric_solution(cycle, cost, self.UG2_cycle, self.UG2_cost)
+
+ def test_not_complete_graph(self):
+ pytest.raises(
+ nx.NetworkXError,
+ self.tsp,
+ self.incompleteUG,
+ "greedy",
+ source=0,
+ )
+ pytest.raises(
+ nx.NetworkXError,
+ self.tsp,
+ self.incompleteDG,
+ "greedy",
+ source=0,
+ )
+
+ def test_ignore_selfloops(self):
+ G = nx.complete_graph(5)
+ G.add_edge(3, 3)
+ cycle = self.tsp(G, "greedy")
+ assert len(cycle) - 1 == len(G) == len(set(cycle))
+
+ def test_not_weighted_graph(self):
+ self.tsp(self.unweightedUG, "greedy")
+ self.tsp(self.unweightedDG, "greedy")
+
+ def test_two_nodes(self):
+ G = nx.Graph()
+ G.add_weighted_edges_from({(1, 2, 1)})
+
+ cycle = self.tsp(G, "greedy", source=1, seed=42)
+ cost = sum(G[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, [1, 2, 1], 2)
+
+ cycle = self.tsp(G, [1, 2, 1], source=1, seed=42)
+ cost = sum(G[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ validate_solution(cycle, cost, [1, 2, 1], 2)
+
+ def test_failure_of_costs_too_high_when_iterations_low(self):
+ # Simulated Annealing Version:
+ # set number of moves low and alpha high
+ cycle = self.tsp(
+ self.DG2, "greedy", source="D", move="1-0", alpha=1, N_inner=1, seed=42
+ )
+ cost = sum(self.DG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ print(cycle, cost)
+ assert cost > self.DG2_cost
+
+ # Try with an incorrect initial guess
+ initial_sol = ["D", "A", "B", "C", "D"]
+ cycle = self.tsp(
+ self.DG,
+ initial_sol,
+ source="D",
+ move="1-0",
+ alpha=0.1,
+ N_inner=1,
+ max_iterations=1,
+ seed=42,
+ )
+ cost = sum(self.DG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ print(cycle, cost)
+ assert cost > self.DG_cost
+
+
+class TestThresholdAcceptingTSP(TestSimulatedAnnealingTSP):
+ tsp = staticmethod(nx_app.threshold_accepting_tsp)
+
+ def test_failure_of_costs_too_high_when_iterations_low(self):
+ # Threshold Version:
+ # set number of moves low and number of iterations low
+ cycle = self.tsp(
+ self.DG2,
+ "greedy",
+ source="D",
+ move="1-0",
+ N_inner=1,
+ max_iterations=1,
+ seed=4,
+ )
+ cost = sum(self.DG2[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ assert cost > self.DG2_cost
+
+ # set threshold too low
+ initial_sol = ["D", "A", "B", "C", "D"]
+ cycle = self.tsp(
+ self.DG,
+ initial_sol,
+ source="D",
+ move="1-0",
+ threshold=-3,
+ seed=42,
+ )
+ cost = sum(self.DG[n][nbr]["weight"] for n, nbr in pairwise(cycle))
+ assert cost > self.DG_cost
+
+
+# Tests for function traveling_salesman_problem
+def test_TSP_method():
+ G = nx.cycle_graph(9)
+ G[4][5]["weight"] = 10
+
+ def my_tsp_method(G, weight):
+ return nx_app.simulated_annealing_tsp(G, "greedy", weight, source=4, seed=1)
+
+ path = nx_app.traveling_salesman_problem(G, method=my_tsp_method, cycle=False)
+ print(path)
+ assert path == [4, 3, 2, 1, 0, 8, 7, 6, 5]
+
+
+def test_TSP_unweighted():
+ G = nx.cycle_graph(9)
+ path = nx_app.traveling_salesman_problem(G, nodes=[3, 6], cycle=False)
+ assert path in ([3, 4, 5, 6], [6, 5, 4, 3])
+
+ cycle = nx_app.traveling_salesman_problem(G, nodes=[3, 6])
+ assert cycle in ([3, 4, 5, 6, 5, 4, 3], [6, 5, 4, 3, 4, 5, 6])
+
+
+def test_TSP_weighted():
+ G = nx.cycle_graph(9)
+ G[0][1]["weight"] = 2
+ G[1][2]["weight"] = 2
+ G[2][3]["weight"] = 2
+ G[3][4]["weight"] = 4
+ G[4][5]["weight"] = 5
+ G[5][6]["weight"] = 4
+ G[6][7]["weight"] = 2
+ G[7][8]["weight"] = 2
+ G[8][0]["weight"] = 2
+ tsp = nx_app.traveling_salesman_problem
+
+ # path between 3 and 6
+ expected_paths = ([3, 2, 1, 0, 8, 7, 6], [6, 7, 8, 0, 1, 2, 3])
+ # cycle between 3 and 6
+ expected_cycles = (
+ [3, 2, 1, 0, 8, 7, 6, 7, 8, 0, 1, 2, 3],
+ [6, 7, 8, 0, 1, 2, 3, 2, 1, 0, 8, 7, 6],
+ )
+ # path through all nodes
+ expected_tourpaths = (
+ [5, 6, 7, 8, 0, 1, 2, 3, 4],
+ [4, 3, 2, 1, 0, 8, 7, 6, 5],
+ )
+
+ # Check default method
+ cycle = tsp(G, nodes=[3, 6], weight="weight")
+ assert cycle in expected_cycles
+
+ path = tsp(G, nodes=[3, 6], weight="weight", cycle=False)
+ assert path in expected_paths
+
+ tourpath = tsp(G, weight="weight", cycle=False)
+ assert tourpath in expected_tourpaths
+
+ # Check all methods
+ methods = [
+ nx_app.christofides,
+ nx_app.greedy_tsp,
+ lambda G, wt: nx_app.simulated_annealing_tsp(G, "greedy", weight=wt),
+ lambda G, wt: nx_app.threshold_accepting_tsp(G, "greedy", weight=wt),
+ ]
+ for method in methods:
+ cycle = tsp(G, nodes=[3, 6], weight="weight", method=method)
+ assert cycle in expected_cycles
+
+ path = tsp(G, nodes=[3, 6], weight="weight", method=method, cycle=False)
+ assert path in expected_paths
+
+ tourpath = tsp(G, weight="weight", method=method, cycle=False)
+ assert tourpath in expected_tourpaths
+
+
+def test_TSP_incomplete_graph_short_path():
+ G = nx.cycle_graph(9)
+ G.add_edges_from([(4, 9), (9, 10), (10, 11), (11, 0)])
+ G[4][5]["weight"] = 5
+
+ cycle = nx_app.traveling_salesman_problem(G)
+ print(cycle)
+ assert len(cycle) == 17 and len(set(cycle)) == 12
+
+ # make sure that cutting one edge out of complete graph formulation
+ # cuts out many edges out of the path of the TSP
+ path = nx_app.traveling_salesman_problem(G, cycle=False)
+ print(path)
+ assert len(path) == 13 and len(set(path)) == 12
diff --git a/networkx/algorithms/approximation/traveling_salesman.py b/networkx/algorithms/approximation/traveling_salesman.py
new file mode 100644
index 00000000..3779588f
--- /dev/null
+++ b/networkx/algorithms/approximation/traveling_salesman.py
@@ -0,0 +1,846 @@
+"""
+=================================
+Travelling Salesman Problem (TSP)
+=================================
+
+Implementation of approximate algorithms
+for solving and approximating the TSP problem.
+
+Categories of algorithms which are implemented:
+
+- Christofides (provides a 3/2-approximation of TSP)
+- Greedy
+- Simulated Annealing (SA)
+- Threshold Accepting (TA)
+
+Given the weight (distance) between all pairs of points that a salesman
+has to visit, the Travelling Salesman Problem tries to find the
+route so that:
+
+- The total distance (cost) which the salesman travels is minimized.
+- The salesman returns to the starting point.
+- Note that for a complete graph, the salesman visits each point once.
+
+The function `travelling_salesman_problem` allows for incomplete
+graphs by finding all-pairs shortest paths, effectively converting
+the problem to a complete graph problem. It calls one of the
+approximate methods on that problem and then converts the result
+back to the original graph using the previously found shortest paths.
+
+TSP is an NP-hard problem in combinatorial optimization,
+important in operations research and theoretical computer science.
+
+http://en.wikipedia.org/wiki/Travelling_salesman_problem
+"""
+import math
+import networkx as nx
+from networkx.utils import py_random_state, not_implemented_for, pairwise
+
+__all__ = [
+ "traveling_salesman_problem",
+ "christofides",
+ "greedy_tsp",
+ "simulated_annealing_tsp",
+ "threshold_accepting_tsp",
+]
+
+
+def swap_two_nodes(soln, seed):
+ """Swap two nodes in `soln` to give a neighbor solution.
+
+ Parameters
+ ----------
+ soln : list of nodes
+ Current cycle of nodes
+
+ seed : integer, random_state, or None (default)
+ Indicator of random number generation state.
+ See :ref:`Randomness<randomness>`.
+
+ Returns
+ -------
+ list
+ The solution after move is applied. (A neighbor solution.)
+
+ Notes
+ -----
+ This function assumes that the incoming list `soln` is a cycle
+ (that the first and last element are the same) and also that
+ we don't want any move to change the first node in the list
+ (and thus not the last node either).
+
+ The input list is changed as well as returned. Make a copy if needed.
+
+ See Also
+ --------
+ move_one_node
+ """
+ a, b = seed.sample(range(1, len(soln) - 1), k=2)
+ soln[a], soln[b] = soln[b], soln[a]
+ return soln
+
+
+def move_one_node(soln, seed):
+ """Move one node to another position to give a neighbor solution.
+
+ The node to move and the position to move to are chosen randomly.
+ The first and last nodes are left untouched as soln must be a cycle
+ starting at that node.
+
+ Parameters
+ ----------
+ soln : list of nodes
+ Current cycle of nodes
+
+ seed : integer, random_state, or None (default)
+ Indicator of random number generation state.
+ See :ref:`Randomness<randomness>`.
+
+ Returns
+ -------
+ list
+ The solution after move is applied. (A neighbor solution.)
+
+ Notes
+ -----
+ This function assumes that the incoming list `soln` is a cycle
+ (that the first and last element are the same) and also that
+ we don't want any move to change the first node in the list
+ (and thus not the last node either).
+
+ The input list is changed as well as returned. Make a copy if needed.
+
+ See Also
+ --------
+ swap_two_nodes
+ """
+ a, b = seed.sample(range(1, len(soln) - 1), k=2)
+ soln.insert(b, soln.pop(a))
+ return soln
+
+
+@not_implemented_for("directed")
+def christofides(G, weight="weight", tree=None):
+ """Approximate a solution of the traveling salesman problem
+
+ Compute a 3/2-approximation of the traveling salesman problem
+ in a complete undirected graph using Christofides [1]_ algorithm.
+
+ Parameters
+ ----------
+ G : Graph
+ `G` should be a complete weighted undirected graph.
+ The distance between all pairs of nodes should be included.
+
+ weight : string, optional (default="weight")
+ Edge data key corresponding to the edge weight.
+ If any edge does not have this attribute the weight is set to 1.
+
+ tree : NetworkX graph or None (default: None)
+ A minimum spanning tree of G. Or, if None, the minimum spanning
+ tree is computed using :func:`networkx.minimum_spanning_tree`
+
+ Returns
+ -------
+ list
+ List of nodes in `G` along a cycle with a 3/2-approximation of
+ the minimal Hamiltonian cycle.
+
+ References
+ ----------
+ .. [1] Christofides, Nicos. "Worst-case analysis of a new heuristic for
+ the travelling salesman problem." No. RR-388. Carnegie-Mellon Univ
+ Pittsburgh Pa Management Sciences Research Group, 1976.
+ """
+ # Remove selfloops if necessary
+ loop_nodes = nx.nodes_with_selfloops(G)
+ try:
+ node = next(loop_nodes)
+ except StopIteration:
+ pass
+ else:
+ G = G.copy()
+ G.remove_edge(node, node)
+ G.remove_edges_from((n, n) for n in loop_nodes)
+ # Check that G is a complete graph
+ N = len(G) - 1
+ # This check ignores selfloops which is what we want here.
+ if any(len(nbrdict) != N for n, nbrdict in G.adj.items()):
+ raise nx.NetworkXError("G must be a complete graph.")
+
+ if tree is None:
+ tree = nx.minimum_spanning_tree(G, weight=weight)
+ L = G.copy()
+ L.remove_nodes_from([v for v, degree in tree.degree if not (degree % 2)])
+ MG = nx.MultiGraph()
+ MG.add_edges_from(tree.edges)
+ edges = nx.min_weight_matching(L, maxcardinality=True, weight=weight)
+ MG.add_edges_from(edges)
+ return _shortcutting(nx.eulerian_circuit(MG))
+
+
+def _shortcutting(circuit):
+ """Remove duplicate nodes in the path"""
+ nodes = []
+ for u, v in circuit:
+ if v in nodes:
+ continue
+ if not nodes:
+ nodes.append(u)
+ nodes.append(v)
+ nodes.append(nodes[0])
+ return nodes
+
+
+def traveling_salesman_problem(G, weight="weight", nodes=None, cycle=True, method=None):
+ """Find the shortest path in `G` connecting specified nodes
+
+ This function allows approximate solution to the traveling salesman
+ problem on networks that are not complete graphs and/or where the
+ salesman does not need to visit all nodes.
+
+ This function proceeds in two steps. First, it creates a complete
+ graph using the all-pairs shortest_paths between nodes in `nodes`.
+ Edge weights in the new graph are the lengths of the paths
+ between each pair of nodes in the original graph.
+ Second, an algorithm (default: `christofides`) is used to approximate
+ the minimal Hamiltonian cycle on this new graph. The available
+ algorithms are:
+ - christofides
+ - greedy_tsp
+ - simulated_annealing_tsp
+ - threshold_accepting_tsp
+
+ Once the Hamiltonian Cycle is found, this function post-processes to
+ accommodate the structure of the original graph. If `cycle` is ``False``,
+ the biggest weight edge is removed to make a Hamiltonian path.
+ Then each edge on the new complete graph used for that analysis is
+ replaced by the shortest_path between those nodes on the original graph.
+
+ Parameters
+ ----------
+ G : NetworkX graph
+ Undirected possibly weighted graph
+
+ nodes : collection of nodes (default=G.nodes)
+ collection (list, set, etc.) of nodes to visit
+
+ weight : string, optional (default="weight")
+ Edge data key corresponding to the edge weight.
+ If any edge does not have this attribute the weight is set to 1.
+
+ cycle : bool (default: True)
+ Indicates whether a cycle should be returned, or a path.
+ Note: the cycle is the approximate minimal cycle.
+ The path simply removes the biggest edge in that cycle.
+
+ method : function (default: None)
+ A function that returns a cycle on all nodes and approximates
+ the solution to the traveling salesman problem on a complete
+ graph. The returned cycle is then used to find a corresponding
+ solution on `G`. `method` should be callable; take inputs
+ `G`, and `weight`; and return a list of nodes along the cycle.
+
+ Provided options include :func:`christofides`, :func:`greedy_tsp`,
+ :func:`simulated_annealing_tsp` and :func:`threshold_accepting_tsp`.
+
+ If `method is None`: use :func:`christofides` for undirected `G` and
+ :func:`threshold_accepting_tsp` for directed `G`.
+
+ To specify parameters for these provided functions, construct lambda
+ functions that state the specific value. `method` must have 2 inputs.
+ (See examples).
+
+ Returns
+ -------
+ list
+ List of nodes in `G` along a path with a 3/2-approximation of the minimal
+ path through `nodes`.
+
+ Examples
+ --------
+ >>> tsp = nx.approximation.traveling_salesman_problem
+ >>> G = nx.cycle_graph(9)
+ >>> G[4][5]["weight"] = 5 # all other weights are 1
+ >>> tsp(G, nodes=[3, 6])
+ [3, 2, 1, 0, 8, 7, 6, 7, 8, 0, 1, 2, 3]
+ >>> path = tsp(G, cycle=False)
+ >>> path in ([4, 3, 2, 1, 0, 8, 7, 6, 5], [5, 6, 7, 8, 0, 1, 2, 3, 4])
+ True
+
+ Build (curry) your own function to provide parameter values to the methods.
+
+ >>> SA_tsp = nx.approximation.simulated_annealing_tsp
+ >>> method = lambda G, wt: SA_tsp(G, "greedy", weight=wt, temp=500)
+ >>> path = tsp(G, cycle=False, method=method)
+ >>> path in ([4, 3, 2, 1, 0, 8, 7, 6, 5], [5, 6, 7, 8, 0, 1, 2, 3, 4])
+ True
+
+ """
+ if method is None:
+ if G.is_directed():
+
+ def threshold_tsp(G, weight):
+ return threshold_accepting_tsp(G, "greedy", weight)
+
+ method = threshold_tsp
+ else:
+ method = christofides
+ if nodes is None:
+ nodes = list(G.nodes)
+
+ dist = {}
+ path = {}
+ for n, (d, p) in nx.all_pairs_dijkstra(G, weight=weight):
+ dist[n] = d
+ path[n] = p
+
+ GG = nx.Graph()
+ for u in nodes:
+ for v in nodes:
+ if u == v:
+ continue
+ GG.add_edge(u, v, weight=dist[u][v])
+ best_GG = method(GG, weight)
+
+ if not cycle:
+ # find and remove the biggest edge
+ biggest_edge = None
+ length_biggest = float("-inf")
+ (u, v) = max(pairwise(best_GG), key=lambda x: dist[x[0]][x[1]])
+ pos = best_GG.index(u) + 1
+ while best_GG[pos] != v:
+ pos = best_GG[pos:].index(u) + 1
+ best_GG = best_GG[pos:-1] + best_GG[:pos]
+
+ best_path = []
+ for u, v in pairwise(best_GG):
+ best_path.extend(path[u][v][:-1])
+ best_path.append(v)
+ return best_path
+
+
+def greedy_tsp(G, weight="weight", source=None):
+ """Return a low cost cycle starting at `source` and its cost.
+
+ This approximates a solution to the traveling salesman problem.
+ It finds a cycle of all the nodes that a salesman can visit in order
+ to visit many nodes while minimizing total distance.
+ It uses a simple greedy algorithm.
+ In essence, this function returns a large cycle given a source point
+ for which the total cost of the cycle is minimized.
+
+ Parameters
+ ----------
+ G : Graph
+ The Graph should be a complete weighted undirected graph.
+ The distance between all pairs of nodes should be included.
+
+ weight : string, optional (default="weight")
+ Edge data key corresponding to the edge weight.
+ If any edge does not have this attribute the weight is set to 1.
+
+ source : node, optional (default: first node in list(G))
+ Starting node. If None, defaults to ``next(iter(G))``
+
+ Returns
+ -------
+ cycle : list of nodes
+ Returns the cycle (list of nodes) that a salesman
+ can follow to minimize total weight of the trip.
+
+ Raises
+ ------
+ NetworkXError
+ If `G` is not complete, the algorithm raises an exception.
+
+ Examples
+ --------
+ >>> from networkx.algorithms import approximation as approx
+ >>> G = nx.DiGraph()
+ >>> G.add_weighted_edges_from({
+ ... ("A", "B", 3), ("A", "C", 17), ("A", "D", 14), ("B", "A", 3),
+ ... ("B", "C", 12), ("B", "D", 16), ("C", "A", 13),("C", "B", 12),
+ ... ("C", "D", 4), ("D", "A", 14), ("D", "B", 15), ("D", "C", 2)
+ ... })
+ >>> cycle = approx.greedy_tsp(G, source="D")
+ >>> cost = sum(G[n][nbr]["weight"] for n, nbr in nx.utils.pairwise(cycle))
+ >>> cycle
+ ['D', 'C', 'B', 'A', 'D']
+ >>> cost
+ 31
+
+ Notes
+ -----
+ This implementation of a greedy algorithm is based on the following:
+
+ - The algorithm adds a node to the solution at every iteration.
+ - The algorithm selects a node not already in the cycle whose connection
+ to the previous node adds the least cost to the cycle.
+
+ A greedy algorithm does not always give the best solution.
+ However, it can construct a first feasible solution which can
+ be passed as a parameter to an iterative improvement algorithm such
+ as Simulated Annealing, or Threshold Accepting.
+
+ Time complexity: It has a running time $O(|V|^2)$
+ """
+ # Check that G is a complete graph
+ N = len(G) - 1
+ # This check ignores selfloops which is what we want here.
+ if any(len(nbrdict) - (n in nbrdict) != N for n, nbrdict in G.adj.items()):
+ raise nx.NetworkXError("G must be a complete graph.")
+
+ if source is None:
+ source = nx.utils.arbitrary_element(G)
+
+ if G.number_of_nodes() == 2:
+ neighbor = next(G.neighbors(source))
+ return [source, neighbor, source]
+
+ nodeset = set(G)
+ nodeset.remove(source)
+ cycle = [source]
+ next_node = source
+ while nodeset:
+ nbrdict = G[next_node]
+ next_node = min(nodeset, key=lambda n: nbrdict[n].get(weight, 1))
+ cycle.append(next_node)
+ nodeset.remove(next_node)
+ cycle.append(cycle[0])
+ return cycle
+
+
+@py_random_state(9)
+def simulated_annealing_tsp(
+ G,
+ init_cycle,
+ weight="weight",
+ source=None,
+ temp=100,
+ move="1-1",
+ max_iterations=10,
+ N_inner=100,
+ alpha=0.01,
+ seed=None,
+):
+ """Returns an approximate solution to the traveling salesman problem.
+
+ This function uses simulated annealing to approximate the minimal cost
+ cycle through the nodes. Starting from a suboptimal solution, simulated
+ annealing perturbs that solution, occasionally accepting changes that make
+ the solution worse to escape from a locally optimal solution. The chance
+ of accepting such changes decreases over the iterations to encourage
+ an optimal result. In summary, the function returns a cycle starting
+    at `source` for which the total cost is minimized.
+
+ The chance of accepting a proposed change is related to a parameter called
+ the temperature (annealing has a physical analogue of steel hardening
+ as it cools). As the temperature is reduced, the chance of moves that
+ increase cost goes down.
+
+ Parameters
+ ----------
+ G : Graph
+ `G` should be a complete weighted undirected graph.
+ The distance between all pairs of nodes should be included.
+
+ init_cycle : list of all nodes or "greedy"
+ The initial solution (a cycle through all nodes).
+ Usually you should use `greedy_tsp(G, weight)`.
+ But you can start with `list(G)` or the final result
+ of `simulated_annealing_tsp` when doing `threshold_accepting_tsp`.
+
+ This argument is required. A shortcut if you don't want to think
+ about it is to use the string "greedy" which calls `greedy_tsp`.
+
+ weight : string, optional (default="weight")
+ Edge data key corresponding to the edge weight.
+ If any edge does not have this attribute the weight is set to 1.
+
+ source : node, optional (default: first node in list(G))
+ Starting node. If None, defaults to ``next(iter(G))``
+
+ temp : int, optional (default=100)
+ The algorithm's temperature parameter. It represents the initial
+ value of temperature
+
+ move : "1-1" or "1-0" or function, optional (default="1-1")
+ Indicator of what move to use when finding new trial solutions.
+ Strings indicate two special built-in moves:
+ - "1-1": 1-1 exchange which transposes the position
+ of two elements of the current solution.
+ The function called is :func:`swap_two_nodes`.
+ For example if we apply 1-1 exchange in the solution
+ ``A = [3, 2, 1, 4, 3]``
+          we can get the following by transposing the elements 1 and 4:
+ ``A' = [3, 2, 4, 1, 3]``
+ - "1-0": 1-0 exchange which moves an node in the solution
+ to a new position.
+ The function called is :func:`move_one_node`.
+ For example if we apply 1-0 exchange in the solution
+ ``A = [3, 2, 1, 4, 3]``
+ we can transfer the fourth element to the second position:
+ ``A' = [3, 4, 2, 1, 3]``
+
+ You may provide your own functions to enact a move from
+ one solution to a neighbor solution. The function must take
+ the solution as input along with a `seed` input to control
+ random number generation (see the `seed` input here).
+ Your function should maintain the solution as a cycle with
+ equal first and last node and all others appearing once.
+ Your function should return the new solution.
+
+ max_iterations : int, optional (default=10)
+ Declared done when this number of consecutive iterations of
+ the outer loop occurs without any change in the best cost solution.
+
+ N_inner : int, optional (default=100)
+ The number of iterations of the inner loop.
+
+ alpha : float between (0, 1), optional (default=0.01)
+ Percentage of temperature decrease in each iteration
+ of outer loop
+
+ seed : integer, random_state, or None (default)
+ Indicator of random number generation state.
+ See :ref:`Randomness<randomness>`.
+
+ Returns
+ -------
+ cycle : list of nodes
+ Returns the cycle (list of nodes) that a salesman
+ can follow to minimize total weight of the trip.
+
+ Raises
+ ------
+ NetworkXError
+ If `G` is not complete the algorithm raises an exception.
+
+ Examples
+ --------
+ >>> from networkx.algorithms import approximation as approx
+ >>> G = nx.DiGraph()
+ >>> G.add_weighted_edges_from({
+ ... ("A", "B", 3), ("A", "C", 17), ("A", "D", 14), ("B", "A", 3),
+ ... ("B", "C", 12), ("B", "D", 16), ("C", "A", 13),("C", "B", 12),
+ ... ("C", "D", 4), ("D", "A", 14), ("D", "B", 15), ("D", "C", 2)
+ ... })
+ >>> cycle = approx.simulated_annealing_tsp(G, "greedy", source="D")
+ >>> cost = sum(G[n][nbr]["weight"] for n, nbr in nx.utils.pairwise(cycle))
+ >>> cycle
+ ['D', 'C', 'B', 'A', 'D']
+ >>> cost
+ 31
+ >>> incycle = ["D", "B", "A", "C", "D"]
+ >>> cycle = approx.simulated_annealing_tsp(G, incycle, source="D")
+ >>> cost = sum(G[n][nbr]["weight"] for n, nbr in nx.utils.pairwise(cycle))
+ >>> cycle
+ ['D', 'C', 'B', 'A', 'D']
+ >>> cost
+ 31
+
+ Notes
+ -----
+ Simulated Annealing is a metaheuristic local search algorithm.
+ The main characteristic of this algorithm is that it accepts
+ even solutions which lead to the increase of the cost in order
+ to escape from low quality local optimal solutions.
+
+ This algorithm needs an initial solution. If not provided, it is
+ constructed by a simple greedy algorithm. At every iteration, the
+ algorithm selects thoughtfully a neighbor solution.
+ Consider $c(x)$ cost of current solution and $c(x')$ cost of a
+ neighbor solution.
+    If $c(x') - c(x) <= 0$ then the neighbor solution becomes the current
+    solution for the next iteration. Otherwise, the algorithm accepts
+    the neighbor solution with probability $p = \exp(-(c(x') - c(x)) / temp)$;
+    if the neighbor solution is not accepted, the current solution is retained.
+
+ `temp` is a parameter of the algorithm and represents temperature.
+
+ Time complexity:
+ For $N_i$ iterations of the inner loop and $N_o$ iterations of the
+ outer loop, this algorithm has running time $O(N_i * N_o * |V|)$.
+
+ For more information and how the algorithm is inspired see:
+ http://en.wikipedia.org/wiki/Simulated_annealing
+ """
+ if move == "1-1":
+ move = swap_two_nodes
+ elif move == "1-0":
+ move = move_one_node
+ if init_cycle == "greedy":
+ # Construct an initial solution using a greedy algorithm.
+ cycle = greedy_tsp(G, weight=weight, source=source)
+ if G.number_of_nodes() == 2:
+ return cycle
+
+ else:
+ cycle = list(init_cycle)
+ if source is None:
+ source = cycle[0]
+ elif source != cycle[0]:
+ raise nx.NetworkXError("source must be first node in init_cycle")
+ if cycle[0] != cycle[-1]:
+ raise nx.NetworkXError("init_cycle must be a cycle. (return to start)")
+
+ if len(cycle) - 1 != len(G) or len(set(G.nbunch_iter(cycle))) != len(G):
+ raise nx.NetworkXError("init_cycle should be a cycle over all nodes in G.")
+
+ # Check that G is a complete graph
+ N = len(G) - 1
+ # This check ignores selfloops which is what we want here.
+ if any(len(nbrdict) - (n in nbrdict) != N for n, nbrdict in G.adj.items()):
+ raise nx.NetworkXError("G must be a complete graph.")
+
+ if G.number_of_nodes() == 2:
+ neighbor = next(G.neighbors(source))
+ return [source, neighbor, source]
+
+ # Find the cost of initial solution
+ cost = sum(G[u][v].get(weight, 1) for u, v in pairwise(cycle))
+
+ count = 0
+ best_cycle = cycle.copy()
+ best_cost = cost
+ while count <= max_iterations and temp > 0:
+ count += 1
+ for i in range(N_inner):
+ adj_sol = move(cycle, seed)
+ adj_cost = sum(G[u][v].get(weight, 1) for u, v in pairwise(adj_sol))
+ delta = adj_cost - cost
+ if delta <= 0:
+ # Set current solution the adjacent solution.
+ cycle = adj_sol
+ cost = adj_cost
+
+ if cost < best_cost:
+ count = 0
+ best_cycle = cycle.copy()
+ best_cost = cost
+ else:
+ # Accept even a worse solution with probability p.
+ p = math.exp(-delta / temp)
+ if p >= seed.random():
+ cycle = adj_sol
+ cost = adj_cost
+ temp -= temp * alpha
+
+ return best_cycle
+
+
+@py_random_state(9)
+def threshold_accepting_tsp(
+ G,
+ init_cycle,
+ weight="weight",
+ source=None,
+ threshold=1,
+ move="1-1",
+ max_iterations=10,
+ N_inner=100,
+ alpha=0.1,
+ seed=None,
+):
+ """Returns an approximate solution to the traveling salesman problem.
+
+ This function uses threshold accepting methods to approximate the minimal cost
+ cycle through the nodes. Starting from a suboptimal solution, threshold
+ accepting methods perturb that solution, accepting any changes that make
+ the solution no worse than increasing by a threshold amount. Improvements
+ in cost are accepted, but so are changes leading to small increases in cost.
+ This allows the solution to leave suboptimal local minima in solution space.
+ The threshold is decreased slowly as iterations proceed helping to ensure
+ an optimum. In summary, the function returns a cycle starting at `source`
+ for which the total cost is minimized.
+
+ Parameters
+ ----------
+ G : Graph
+ `G` should be a complete weighted undirected graph.
+ The distance between all pairs of nodes should be included.
+
+ weight : string, optional (default="weight")
+ Edge data key corresponding to the edge weight.
+ If any edge does not have this attribute the weight is set to 1.
+
+ source : node, optional (default: first node in list(G))
+ Starting node. If None, defaults to ``next(iter(G))``
+
+ threshold : int, optional (default=1)
+ The algorithm's threshold parameter. It represents the initial
+ threshold's value
+
+ move : "1-1" or "1-0" or function, optional (default="1-1")
+ Indicator of what move to use when finding new trial solutions.
+ Strings indicate two special built-in moves:
+ - "1-1": 1-1 exchange which transposes the position
+ of two elements of the current solution.
+ The function called is :func:`swap_two_nodes`.
+ For example if we apply 1-1 exchange in the solution
+ ``A = [3, 2, 1, 4, 3]``
+          we can get the following by transposing the elements 1 and 4:
+ ``A' = [3, 2, 4, 1, 3]``
+ - "1-0": 1-0 exchange which moves an node in the solution
+ to a new position.
+ The function called is :func:`move_one_node`.
+ For example if we apply 1-0 exchange in the solution
+ ``A = [3, 2, 1, 4, 3]``
+ we can transfer the fourth element to the second position:
+ ``A' = [3, 4, 2, 1, 3]``
+
+ You may provide your own functions to enact a move from
+ one solution to a neighbor solution. The function must take
+ the solution as input along with a `seed` input to control
+ random number generation (see the `seed` input here).
+ Your function should maintain the solution as a cycle with
+ equal first and last node and all others appearing once.
+ Your function should return the new solution.
+
+ max_iterations : int, optional (default=10)
+ Declared done when this number of consecutive iterations of
+ the outer loop occurs without any change in the best cost solution.
+
+ N_inner : int, optional (default=100)
+ The number of iterations of the inner loop.
+
+ alpha : float between (0, 1), optional (default=0.1)
+ Percentage of threshold decrease when there is at
+ least one acceptance of a neighbor solution.
+ If no inner loop moves are accepted the threshold remains unchanged.
+
+ cycle : list, optional (default=compute using greedy algorithm)
+        The initial solution (a cycle through all nodes).
+
+ seed : integer, random_state, or None (default)
+ Indicator of random number generation state.
+ See :ref:`Randomness<randomness>`.
+
+ Returns
+ -------
+ cycle : list of nodes
+ Returns the cycle (list of nodes) that a salesman
+ can follow to minimize total weight of the trip.
+
+ Raises
+ ------
+ NetworkXError
+ If `G` is not complete the algorithm raises an exception.
+
+ Examples
+ --------
+ >>> from networkx.algorithms import approximation as approx
+ >>> G = nx.DiGraph()
+ >>> G.add_weighted_edges_from({
+ ... ("A", "B", 3), ("A", "C", 17), ("A", "D", 14), ("B", "A", 3),
+ ... ("B", "C", 12), ("B", "D", 16), ("C", "A", 13),("C", "B", 12),
+ ... ("C", "D", 4), ("D", "A", 14), ("D", "B", 15), ("D", "C", 2)
+ ... })
+ >>> cycle = approx.threshold_accepting_tsp(G, "greedy", source="D")
+ >>> cost = sum(G[n][nbr]["weight"] for n, nbr in nx.utils.pairwise(cycle))
+ >>> cycle
+ ['D', 'C', 'B', 'A', 'D']
+ >>> cost
+ 31
+ >>> incycle = ["D", "B", "A", "C", "D"]
+ >>> cycle = approx.threshold_accepting_tsp(G, incycle, source="D")
+ >>> cost = sum(G[n][nbr]["weight"] for n, nbr in nx.utils.pairwise(cycle))
+ >>> cycle
+ ['D', 'C', 'B', 'A', 'D']
+ >>> cost
+ 31
+
+ Notes
+ -----
+ Threshold Accepting is a metaheuristic local search algorithm.
+ The main characteristic of this algorithm is that it accepts
+ even solutions which lead to the increase of the cost in order
+ to escape from low quality local optimal solutions.
+
+ This algorithm needs an initial solution. This solution can be
+ constructed by a simple greedy algorithm. At every iteration, it
+ selects thoughtfully a neighbor solution.
+ Consider $c(x)$ cost of current solution and $c(x')$ cost of
+ neighbor solution.
+    If $c(x') - c(x) <= threshold$ then the neighbor solution becomes the
+    current solution for the next iteration, where `threshold` is the
+    algorithm's threshold parameter.
+
+ In comparison to the Simulated Annealing algorithm, the Threshold
+ Accepting algorithm does not accept very low quality solutions
+ (due to the presence of the threshold value). In the case of
+ Simulated Annealing, even a very low quality solution can
+ be accepted with probability $p$.
+
+ Time complexity:
+ It has a running time $O(m * n * |V|)$ where $m$ and $n$ are the number
+ of times the outer and inner loop run respectively.
+
+    For more information and how the algorithm is inspired see:
+ http://en.wikipedia.org/wiki/Threshold_accepting
+
+ See Also
+ --------
+ simulated_annealing_tsp
+
+ """
+ if move == "1-1":
+ move = swap_two_nodes
+ elif move == "1-0":
+ move = move_one_node
+ if init_cycle == "greedy":
+ # Construct an initial solution using a greedy algorithm.
+ cycle = greedy_tsp(G, weight=weight, source=source)
+ if G.number_of_nodes() == 2:
+ return cycle
+
+ else:
+ cycle = list(init_cycle)
+ if source is None:
+ source = cycle[0]
+ elif source != cycle[0]:
+ raise nx.NetworkXError("source must be first node in init_cycle")
+ if cycle[0] != cycle[-1]:
+ raise nx.NetworkXError("init_cycle must be a cycle. (return to start)")
+
+ if len(cycle) - 1 != len(G) or len(set(G.nbunch_iter(cycle))) != len(G):
+ raise nx.NetworkXError("init_cycle is not all and only nodes.")
+
+ # Check that G is a complete graph
+ N = len(G) - 1
+ # This check ignores selfloops which is what we want here.
+ if any(len(nbrdict) - (n in nbrdict) != N for n, nbrdict in G.adj.items()):
+ raise nx.NetworkXError("G must be a complete graph.")
+
+ if G.number_of_nodes() == 2:
+ neighbor = list(G.neighbors(source))[0]
+ return [source, neighbor, source]
+
+ # Find the cost of initial solution
+ cost = sum(G[u][v].get(weight, 1) for u, v in pairwise(cycle))
+
+ count = 0
+ best_cycle = cycle.copy()
+ best_cost = cost
+ while count <= max_iterations:
+ count += 1
+ accepted = False
+ for i in range(N_inner):
+ adj_sol = move(cycle, seed)
+ adj_cost = sum(G[u][v].get(weight, 1) for u, v in pairwise(adj_sol))
+ delta = adj_cost - cost
+ if delta <= threshold:
+ accepted = True
+
+ # Set current solution the adjacent solution.
+ cycle = adj_sol
+ cost = adj_cost
+
+ if cost < best_cost:
+ count = 0
+ best_cycle = cycle.copy()
+ best_cost = cost
+ if accepted:
+ threshold -= threshold * alpha
+
+ return best_cycle
diff --git a/networkx/algorithms/matching.py b/networkx/algorithms/matching.py
index 4bcd9236..cbafe8fa 100644
--- a/networkx/algorithms/matching.py
+++ b/networkx/algorithms/matching.py
@@ -1,4 +1,6 @@
"""Functions for computing and verifying matchings in a graph."""
+import networkx as nx
+from networkx.utils import not_implemented_for
from collections import Counter
from itertools import combinations
from itertools import repeat
@@ -8,10 +10,13 @@ __all__ = [
"is_maximal_matching",
"is_perfect_matching",
"max_weight_matching",
+ "min_weight_matching",
"maximal_matching",
]
+@not_implemented_for("multigraph")
+@not_implemented_for("directed")
def maximal_matching(G):
r"""Find a maximal matching in the graph.
@@ -46,7 +51,9 @@ def maximal_matching(G):
def matching_dict_to_set(matching):
- """Converts a dictionary representing a matching (as returned by
+ """Converts matching dict format to matching set format
+
+ Converts a dictionary representing a matching (as returned by
:func:`max_weight_matching`) to a set representing a matching (as
returned by :func:`maximal_matching`).
@@ -66,8 +73,7 @@ def matching_dict_to_set(matching):
def is_matching(G, matching):
- """Decides whether the given set or dictionary represents a valid
- matching in ``G``.
+ """Return True if ``matching`` is a valid matching of ``G``
A *matching* in a graph is a set of edges in which no two distinct
edges share a common endpoint.
@@ -104,8 +110,7 @@ def is_matching(G, matching):
def is_maximal_matching(G, matching):
- """Decides whether the given set or dictionary represents a valid
- maximal matching in ``G``.
+ """Return True if ``matching`` is a maximal matching of ``G``
A *maximal matching* in a graph is a matching in which adding any
edge would cause the set to no longer be a valid matching.
@@ -148,8 +153,7 @@ def is_maximal_matching(G, matching):
def is_perfect_matching(G, matching):
- """Decides whether the given set represents a valid perfect matching in
- ``G``.
+ """Return True if ``matching`` is a perfect matching for ``G``
A *perfect matching* in a graph is a matching in which exactly one edge
is incident upon each vertex.
@@ -183,6 +187,45 @@ def is_perfect_matching(G, matching):
return all(counts[v] == 1 for v in G)
+@not_implemented_for("multigraph")
+@not_implemented_for("directed")
+def min_weight_matching(G, maxcardinality=False, weight="weight"):
+ """Use reciprocal edge weights to find max reciprocal weight matching.
+
+ This method replaces the weights with their reciprocal and
+ then runs :func:`max_weight_matching`.
+ Read the documentation of max_weight_matching for more information.
+
+ Parameters
+ ----------
+ G : NetworkX graph
+ Undirected graph
+
+ maxcardinality: bool, optional (default=False)
+ If maxcardinality is True, compute the maximum-cardinality matching
+ with minimum weight among all maximum-cardinality matchings.
+
+ weight: string, optional (default='weight')
+ Edge data key corresponding to the edge weight.
+ If key not found, uses 1 as weight.
+
+ Returns
+ -------
+ matching : set
+ A minimal weight matching of the graph.
+ """
+ if len(G.edges) == 0:
+ return max_weight_matching(G, maxcardinality, weight)
+ G_edges = G.edges(data=weight, default=1)
+ min_weight = min([w for _, _, w in G_edges])
+ InvG = nx.Graph()
+ edges = ((u, v, 1 / (1 + w - min_weight)) for u, v, w in G_edges)
+ InvG.add_weighted_edges_from(edges, weight=weight)
+ return max_weight_matching(InvG, maxcardinality, weight)
+
+
+@not_implemented_for("multigraph")
+@not_implemented_for("directed")
def max_weight_matching(G, maxcardinality=False, weight="weight"):
"""Compute a maximum-weighted matching of G.
diff --git a/networkx/algorithms/tests/test_matching.py b/networkx/algorithms/tests/test_matching.py
index df9b8aba..5252e1cc 100644
--- a/networkx/algorithms/tests/test_matching.py
+++ b/networkx/algorithms/tests/test_matching.py
@@ -1,4 +1,5 @@
from itertools import permutations
+from pytest import raises
import math
import networkx as nx
@@ -16,12 +17,14 @@ class TestMaxWeightMatching:
"""Empty graph"""
G = nx.Graph()
assert nx.max_weight_matching(G) == set()
+ assert nx.min_weight_matching(G) == set()
def test_trivial2(self):
"""Self loop"""
G = nx.Graph()
G.add_edge(0, 0, weight=100)
assert nx.max_weight_matching(G) == set()
+ assert nx.min_weight_matching(G) == set()
def test_trivial3(self):
"""Single edge"""
@@ -30,6 +33,9 @@ class TestMaxWeightMatching:
assert_edges_equal(
nx.max_weight_matching(G), matching_dict_to_set({0: 1, 1: 0})
)
+ assert_edges_equal(
+ nx.min_weight_matching(G), matching_dict_to_set({0: 1, 1: 0})
+ )
def test_trivial4(self):
"""Small graph"""
@@ -40,6 +46,10 @@ class TestMaxWeightMatching:
nx.max_weight_matching(G),
matching_dict_to_set({"three": "two", "two": "three"}),
)
+ assert_edges_equal(
+ nx.min_weight_matching(G),
+ matching_dict_to_set({"one": "two", "two": "one"}),
+ )
def test_trivial5(self):
"""Path"""
@@ -53,6 +63,12 @@ class TestMaxWeightMatching:
assert_edges_equal(
nx.max_weight_matching(G, 1), matching_dict_to_set({1: 2, 2: 1, 3: 4, 4: 3})
)
+ assert_edges_equal(
+ nx.min_weight_matching(G), matching_dict_to_set({1: 2, 3: 4})
+ )
+ assert_edges_equal(
+ nx.min_weight_matching(G, 1), matching_dict_to_set({1: 2, 3: 4})
+ )
def test_trivial6(self):
"""Small graph with arbitrary weight attribute"""
@@ -63,6 +79,10 @@ class TestMaxWeightMatching:
nx.max_weight_matching(G, weight="abcd"),
matching_dict_to_set({"one": "two", "two": "one"}),
)
+ assert_edges_equal(
+ nx.min_weight_matching(G, weight="abcd"),
+ matching_dict_to_set({"three": "two"}),
+ )
def test_floating_point_weights(self):
"""Floating point weights"""
@@ -74,6 +94,9 @@ class TestMaxWeightMatching:
assert_edges_equal(
nx.max_weight_matching(G), matching_dict_to_set({1: 4, 2: 3, 3: 2, 4: 1})
)
+ assert_edges_equal(
+ nx.min_weight_matching(G), matching_dict_to_set({1: 4, 2: 3, 3: 2, 4: 1})
+ )
def test_negative_weights(self):
"""Negative weights"""
@@ -89,20 +112,25 @@ class TestMaxWeightMatching:
assert_edges_equal(
nx.max_weight_matching(G, 1), matching_dict_to_set({1: 3, 2: 4, 3: 1, 4: 2})
)
+ assert_edges_equal(
+ nx.min_weight_matching(G), matching_dict_to_set({1: 2, 3: 4})
+ )
+ assert_edges_equal(
+ nx.min_weight_matching(G, 1), matching_dict_to_set({1: 2, 3: 4})
+ )
def test_s_blossom(self):
"""Create S-blossom and use it for augmentation:"""
G = nx.Graph()
G.add_weighted_edges_from([(1, 2, 8), (1, 3, 9), (2, 3, 10), (3, 4, 7)])
- assert_edges_equal(
- nx.max_weight_matching(G), matching_dict_to_set({1: 2, 2: 1, 3: 4, 4: 3})
- )
+ answer = matching_dict_to_set({1: 2, 2: 1, 3: 4, 4: 3})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
G.add_weighted_edges_from([(1, 6, 5), (4, 5, 6)])
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1}),
- )
+ answer = matching_dict_to_set({1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_s_t_blossom(self):
"""Create S-blossom, relabel as T-blossom, use for augmentation:"""
@@ -110,22 +138,20 @@ class TestMaxWeightMatching:
G.add_weighted_edges_from(
[(1, 2, 9), (1, 3, 8), (2, 3, 10), (1, 4, 5), (4, 5, 4), (1, 6, 3)]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1}),
- )
+ answer = matching_dict_to_set({1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
+
G.add_edge(4, 5, weight=3)
G.add_edge(1, 6, weight=4)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 6, 2: 3, 3: 2, 4: 5, 5: 4, 6: 1}),
- )
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
+
G.remove_edge(1, 6)
G.add_edge(3, 6, weight=4)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 2, 2: 1, 3: 6, 4: 5, 5: 4, 6: 3}),
- )
+ answer = matching_dict_to_set({1: 2, 2: 1, 3: 6, 4: 5, 5: 4, 6: 3})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nested_s_blossom(self):
"""Create nested S-blossom, use for augmentation:"""
@@ -146,6 +172,8 @@ class TestMaxWeightMatching:
expected = {frozenset(e) for e in matching_dict_to_set(dict_format)}
answer = {frozenset(e) for e in nx.max_weight_matching(G)}
assert answer == expected
+ answer = {frozenset(e) for e in nx.min_weight_matching(G)}
+ assert answer == expected
def test_nested_s_blossom_relabel(self):
"""Create S-blossom, relabel as S, include in nested S-blossom:"""
@@ -163,10 +191,9 @@ class TestMaxWeightMatching:
(7, 8, 8),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 2, 2: 1, 3: 4, 4: 3, 5: 6, 6: 5, 7: 8, 8: 7}),
- )
+ answer = matching_dict_to_set({1: 2, 2: 1, 3: 4, 4: 3, 5: 6, 6: 5, 7: 8, 8: 7})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nested_s_blossom_expand(self):
"""Create nested S-blossom, augment, expand recursively:"""
@@ -185,10 +212,9 @@ class TestMaxWeightMatching:
(7, 8, 12),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 2, 2: 1, 3: 5, 4: 6, 5: 3, 6: 4, 7: 8, 8: 7}),
- )
+ answer = matching_dict_to_set({1: 2, 2: 1, 3: 5, 4: 6, 5: 3, 6: 4, 7: 8, 8: 7})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_s_blossom_relabel_expand(self):
"""Create S-blossom, relabel as T, expand:"""
@@ -205,10 +231,9 @@ class TestMaxWeightMatching:
(5, 7, 13),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4}),
- )
+ answer = matching_dict_to_set({1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nested_s_blossom_relabel_expand(self):
"""Create nested S-blossom, relabel as T, expand:"""
@@ -226,10 +251,9 @@ class TestMaxWeightMatching:
(5, 6, 7),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set({1: 8, 2: 3, 3: 2, 4: 7, 5: 6, 6: 5, 7: 4, 8: 1}),
- )
+ answer = matching_dict_to_set({1: 8, 2: 3, 3: 2, 4: 7, 5: 6, 6: 5, 7: 4, 8: 1})
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nasty_blossom1(self):
"""Create blossom, relabel as T in more than one way, expand,
@@ -250,12 +274,10 @@ class TestMaxWeightMatching:
(9, 10, 5),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set(
- {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4, 9: 10, 10: 9}
- ),
- )
+ ansdict = {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4, 9: 10, 10: 9}
+ answer = matching_dict_to_set(ansdict)
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nasty_blossom2(self):
"""Again but slightly different:"""
@@ -274,12 +296,10 @@ class TestMaxWeightMatching:
(9, 10, 5),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set(
- {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4, 9: 10, 10: 9}
- ),
- )
+ ans = {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4, 9: 10, 10: 9}
+ answer = matching_dict_to_set(ans)
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nasty_blossom_least_slack(self):
"""Create blossom, relabel as T, expand such that a new
@@ -300,12 +320,10 @@ class TestMaxWeightMatching:
(9, 10, 5),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set(
- {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4, 9: 10, 10: 9}
- ),
- )
+ ans = {1: 6, 2: 3, 3: 2, 4: 8, 5: 7, 6: 1, 7: 5, 8: 4, 9: 10, 10: 9}
+ answer = matching_dict_to_set(ans)
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nasty_blossom_augmenting(self):
"""Create nested blossom, relabel as T in more than one way"""
@@ -329,25 +347,23 @@ class TestMaxWeightMatching:
(11, 12, 5),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set(
- {
- 1: 8,
- 2: 3,
- 3: 2,
- 4: 6,
- 5: 9,
- 6: 4,
- 7: 10,
- 8: 1,
- 9: 5,
- 10: 7,
- 11: 12,
- 12: 11,
- }
- ),
- )
+ ans = {
+ 1: 8,
+ 2: 3,
+ 3: 2,
+ 4: 6,
+ 5: 9,
+ 6: 4,
+ 7: 10,
+ 8: 1,
+ 9: 5,
+ 10: 7,
+ 11: 12,
+ 12: 11,
+ }
+ answer = matching_dict_to_set(ans)
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
def test_nasty_blossom_expand_recursively(self):
"""Create nested S-blossom, relabel as S, expand recursively:"""
@@ -367,12 +383,17 @@ class TestMaxWeightMatching:
(4, 9, 30),
]
)
- assert_edges_equal(
- nx.max_weight_matching(G),
- matching_dict_to_set(
- {1: 2, 2: 1, 3: 5, 4: 9, 5: 3, 6: 7, 7: 6, 8: 10, 9: 4, 10: 8}
- ),
- )
+ ans = {1: 2, 2: 1, 3: 5, 4: 9, 5: 3, 6: 7, 7: 6, 8: 10, 9: 4, 10: 8}
+ answer = matching_dict_to_set(ans)
+ assert_edges_equal(nx.max_weight_matching(G), answer)
+ assert_edges_equal(nx.min_weight_matching(G), answer)
+
+ def test_wrong_graph_type(self):
+ error = nx.NetworkXNotImplemented
+ raises(error, nx.max_weight_matching, nx.MultiGraph())
+ raises(error, nx.max_weight_matching, nx.MultiDiGraph())
+ raises(error, nx.max_weight_matching, nx.DiGraph())
+ raises(error, nx.min_weight_matching, nx.DiGraph())
class TestIsMatching:
@@ -515,3 +536,9 @@ class TestMaximalMatching:
matching = nx.maximal_matching(G)
assert len(matching) == 1
assert nx.is_maximal_matching(G, matching)
+
+ def test_wrong_graph_type(self):
+ error = nx.NetworkXNotImplemented
+ raises(error, nx.maximal_matching, nx.MultiGraph())
+ raises(error, nx.maximal_matching, nx.MultiDiGraph())
+ raises(error, nx.maximal_matching, nx.DiGraph())
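The new test_wrong_graph_type cases above pin down the effect of the not_implemented_for decorators: the matching functions now reject directed and multigraph inputs. A small sketch of the behavior these assertions exercise (illustration only, not part of the diff):

# Illustration only: what the new test_wrong_graph_type cases assert.
import networkx as nx
import pytest

for bad_graph in (nx.MultiGraph(), nx.MultiDiGraph(), nx.DiGraph()):
    with pytest.raises(nx.NetworkXNotImplemented):
        nx.maximal_matching(bad_graph)

with pytest.raises(nx.NetworkXNotImplemented):
    nx.min_weight_matching(nx.DiGraph())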
diff --git a/networkx/classes/graph.py b/networkx/classes/graph.py
index 4ba639f4..216a37d2 100644
--- a/networkx/classes/graph.py
+++ b/networkx/classes/graph.py
@@ -189,9 +189,7 @@ class Graph:
dict-like object. In general, the dict-like features should be
maintained but extra features can be added. To replace one of the
dicts create a new graph class by changing the class(!) variable
- holding the factory for that dict-like structure. The variable names are
- node_dict_factory, node_attr_dict_factory, adjlist_inner_dict_factory,
- adjlist_outer_dict_factory, edge_attr_dict_factory and graph_attr_dict_factory.
+ holding the factory for that dict-like structure.
node_dict_factory : function, (default: dict)
Factory function to be used to create the dict containing node
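The trimmed docstring above still describes the customization mechanism: a subclass replaces one of the internal dict-like structures by overriding the class(!) variable holding its factory. A minimal sketch of that mechanism, illustration only, using the node_dict_factory variable documented in the surrounding docstring:

# Illustration only: replacing one dict-like factory in a Graph subclass.
import networkx as nx

class LoggingDict(dict):
    """A dict that reports every insertion."""
    def __setitem__(self, key, value):
        print(f"storing {key!r}")
        super().__setitem__(key, value)

class LoggingGraph(nx.Graph):
    # Class variable, as the docstring notes; instances pick it up
    # when they build their internal node container.
    node_dict_factory = LoggingDict

G = LoggingGraph()
G.add_node("a")       # prints: storing 'a'
G.add_edge("a", "b")  # prints: storing 'b' (node 'b' is created implicitly)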
diff --git a/networkx/tests/test_all_random_functions.py b/networkx/tests/test_all_random_functions.py
index 0611462e..e8aaba1e 100644
--- a/networkx/tests/test_all_random_functions.py
+++ b/networkx/tests/test_all_random_functions.py
@@ -47,6 +47,8 @@ def run_all_random_functions(seed):
sizes = (20, 20, 10)
colors = [1, 2, 3]
G = nx.barbell_graph(12, 20)
+ H = nx.cycle_graph(3)
+ H.add_weighted_edges_from((u, v, 0.2) for u, v in H.edges)
deg_sequence = [3, 2, 1, 3, 2, 1, 3, 2, 1, 2, 1, 2, 1]
in_degree_sequence = w = sequence = aseq = bseq = deg_sequence
@@ -69,6 +71,18 @@ def run_all_random_functions(seed):
t(nx.spectral_ordering, G, seed=seed)
# print('starting average_clustering')
t(approx.average_clustering, G, seed=seed)
+ t(approx.simulated_annealing_tsp, H, "greedy", source=1, seed=seed)
+ t(approx.threshold_accepting_tsp, H, "greedy", source=1, seed=seed)
+ t(
+ approx.traveling_salesman_problem,
+ H,
+ method=lambda G, wt: approx.simulated_annealing_tsp(G, "greedy", wt, seed=seed),
+ )
+ t(
+ approx.traveling_salesman_problem,
+ H,
+ method=lambda G, wt: approx.threshold_accepting_tsp(G, "greedy", wt, seed=seed),
+ )
t(nx.betweenness_centrality, G, seed=seed)
t(nx.edge_betweenness_centrality, G, seed=seed)
t(nx.edge_betweenness, G, seed=seed)
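The hunk above wires the new TSP approximations into the seeded-function smoke tests. A minimal sketch of the same calls outside the test harness, illustration only with seed=42 as an arbitrary choice:

# Illustration only: the seeded TSP approximation calls exercised above.
import networkx as nx
from networkx.algorithms import approximation as approx

H = nx.cycle_graph(3)
H.add_weighted_edges_from((u, v, 0.2) for u, v in H.edges)

cycle = approx.simulated_annealing_tsp(H, "greedy", source=1, seed=42)
tour = approx.traveling_salesman_problem(
    H,
    method=lambda G, wt: approx.threshold_accepting_tsp(G, "greedy", wt, seed=42),
)
print(cycle, tour)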