Graph-Theoretic Analysis of Belief System Dynamics under Logic Constraints

Opinion formation cannot be modeled solely as an ideological deduction from a set of principles; rather, repeated social interactions and logic constraints among statements are consequential in the construction of belief systems. We address three basic questions in the analysis of social opinion dynamics: (i) Will a belief system converge? (ii) How long does it take to converge? (iii) Where does it converge? We provide graph-theoretic answers to these questions for a model of opinion dynamics of a belief system with logic constraints. Our results make plain the implicit dependence of the convergence properties of a belief system on the underlying social network and on the set of logic constraints that relate beliefs on different statements. Moreover, we provide an explicit analysis of a variety of commonly used large-scale network models.

Definition 1 (1, 2) Let $A$ be an $m \times n$ matrix and $C$ a $p \times q$ matrix. The Kronecker product $A \otimes C$ is the $mp \times nq$ matrix defined as
$$A \otimes C = \begin{bmatrix} a_{11} C & \cdots & a_{1n} C \\ \vdots & \ddots & \vdots \\ a_{m1} C & \cdots & a_{mn} C \end{bmatrix}.$$
Next, we enumerate some useful properties of the Kronecker product.
1. Bilinearity and associativity: for matrices $A$, $B$ and $C$, and a scalar $k$, it holds that
$$A \otimes (B + C) = A \otimes B + A \otimes C, \qquad (A + B) \otimes C = A \otimes C + B \otimes C,$$
$$(kA) \otimes B = A \otimes (kB) = k\,(A \otimes B), \qquad (A \otimes B) \otimes C = A \otimes (B \otimes C).$$
2. Non-commutativity: in general $A \otimes B \neq B \otimes A$. However, there exist commutation (permutation) matrices $P$ and $Q$ such that
$$A \otimes B = P\,(B \otimes A)\,Q,$$
and if $A$ and $B$ are square matrices then $P = Q^{\top}$.
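The properties above, together with the mixed-product identity $(A \otimes B)(C \otimes D) = (AC) \otimes (BD)$ (a standard Kronecker identity used implicitly in the proof of Lemma 1), can be checked numerically. The following sketch uses toy random matrices chosen only for illustration:

```python
# Numerically check Kronecker-product identities: bilinearity (Property 1)
# and the mixed-product identity (A x B)(C x D) = (AC) x (BD).
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.random((2, 3)), rng.random((2, 3))
C, D = rng.random((3, 2)), rng.random((3, 2))
k = 2.5

# Bilinearity: (A + B) x C = A x C + B x C, and scalars factor out.
assert np.allclose(np.kron(A + B, C), np.kron(A, C) + np.kron(B, C))
assert np.allclose(np.kron(k * A, C), k * np.kron(A, C))

# Mixed-product identity: shapes must chain (A is 2x3, C is 3x2).
lhs = np.kron(A, B) @ np.kron(C, D)
rhs = np.kron(A @ C, B @ D)
assert np.allclose(lhs, rhs)
print("Kronecker identities verified")
```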
Next, we introduce the Kronecker product of graphs and some of its properties.
Definition 2 (1, Definition 1) The Kronecker (also known as categorical, direct, cardinal, relational, tensor, weak direct or conjunction) product $G = G_1 \otimes G_2$ of two graphs $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$ is the graph with vertex set $V = V_1 \times V_2$, in which the vertex $(i, j)$ is connected to the vertex $(\bar i, \bar j)$ if and only if $(i, \bar i) \in E_1$ and $(j, \bar j) \in E_2$. Moreover, the adjacency matrix of the graph $G$ is the Kronecker product of the adjacency matrices of $G_1$ and $G_2$.
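The adjacency-matrix characterization in Definition 2 can be verified directly on two small directed cycles (toy graphs chosen for illustration), using the row/column indexing convention of `numpy.kron`:

```python
# Build the Kronecker (tensor) product of two small directed graphs and
# check Definition 2: ((i, j), (p, q)) is an edge of G1 x G2 exactly when
# (i, p) is an edge of G1 and (j, q) is an edge of G2.
import numpy as np

A1 = np.array([[0, 1],
               [1, 0]])          # G1: directed 2-cycle
A2 = np.array([[0, 1, 0],
               [0, 0, 1],
               [1, 0, 0]])       # G2: directed 3-cycle

A = np.kron(A1, A2)              # adjacency matrix of G1 x G2
n1, n2 = A1.shape[0], A2.shape[0]

# Vertex (i, j) of the product corresponds to row i * n2 + j of np.kron.
for i in range(n1):
    for j in range(n2):
        for p in range(n1):
            for q in range(n2):
                assert A[i * n2 + j, p * n2 + q] == A1[i, p] * A2[j, q]
print("adjacency of the product graph matches Definition 2")
```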
Theorem 1 (3, Theorem 1) Let $G$ and $H$ be strongly connected graphs. Then $G \otimes H$ is strongly connected if and only if the periods of $G$ and $H$ (the greatest common divisors of their cycle lengths) are coprime. In particular, if $G$ and $H$ are strongly connected and aperiodic, then $G \otimes H$ is strongly connected and aperiodic.
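The classical strong-connectivity criterion for Kronecker products (for strongly connected factors, the product is strongly connected exactly when their periods are coprime) can be illustrated on directed cycles, whose period equals their length. This is a small numerical sketch using `scipy`:

```python
# Illustrate the strong-connectivity criterion on directed cycles:
# the product of two strongly connected graphs is strongly connected
# iff their periods are coprime.
import numpy as np
from scipy.sparse.csgraph import connected_components

def cycle(n):
    """Adjacency matrix of the directed n-cycle (period n)."""
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, (i + 1) % n] = 1
    return A

def n_scc(A):
    n, _ = connected_components(A, directed=True, connection='strong')
    return n

# periods 2 and 2: gcd = 2, so the product splits into 2 components
assert n_scc(np.kron(cycle(2), cycle(2))) == 2
# periods 2 and 3: gcd = 1, so the product is strongly connected
assert n_scc(np.kron(cycle(2), cycle(3))) == 1
print("period test matches the strong-connectivity criterion")
```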

Supplementary Note 2: Main Technical Results
We now describe the proof of our main result, namely the number of interactions required for a belief system to be arbitrarily close to its limiting set of beliefs. We start with a technical lemma about the strongly connected components of the product of two graphs.
Lemma 1 Given two graphs $G_1$ and $G_2$, every strongly connected component of the Kronecker product graph $G_1 \otimes G_2$ is the result of the Kronecker product of a strongly connected component of $G_1$ and a strongly connected component of $G_2$.

Proof 1 Let $A_1$ and $A_2$ denote the adjacency matrices of the graphs $G_1$ and $G_2$, respectively. We can construct the condensation of each graph by contracting every strongly connected component to a single vertex, resulting in a directed acyclic graph. Thus, a topological ordering is possible (see Cormen et al., ref. 4, Section 22.4) and there always exist two permutation matrices $P_1$ and $P_2$ that rearrange $A_1$ and $A_2$ into block upper triangular form, where each diagonal block is a strongly connected component, that is,
$$P_1 A_1 P_1^{\top} = \begin{bmatrix} A_1^{(1)} & * & \cdots \\ 0 & A_1^{(2)} & \cdots \\ \vdots & & \ddots \end{bmatrix}, \qquad P_2 A_2 P_2^{\top} = \begin{bmatrix} A_2^{(1)} & * & \cdots \\ 0 & A_2^{(2)} & \cdots \\ \vdots & & \ddots \end{bmatrix}.$$
Moreover, define $P = P_1 \otimes P_2$; by the mixed-product property of the Kronecker product, $(A \otimes B)(C \otimes D) = (AC) \otimes (BD)$, it follows that
$$P\,(A_1 \otimes A_2)\,P^{\top} = (P_1 A_1 P_1^{\top}) \otimes (P_2 A_2 P_2^{\top}),$$
where $P$ is also a permutation matrix. Finally, by Property 2 in Definition 1 there exists a permutation matrix $Q$ such that $Q\,(P_1 A_1 P_1^{\top}) \otimes (P_2 A_2 P_2^{\top})\,Q^{\top}$ is block upper triangular with diagonal blocks of the form $A_1^{(i)} \otimes A_2^{(j)}$. Therefore, every diagonal block in the block upper triangular form of the Kronecker product of the two adjacency matrices is the Kronecker product of two strongly connected components, one from each graph.
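Lemma 1 can be checked computationally: every strongly connected component of the product must lie inside the product of one component of $G_1$ with one component of $G_2$. The following sketch uses two toy graphs chosen for illustration:

```python
# Check Lemma 1: every strongly connected component of G1 x G2 lies inside
# the Kronecker product of one SCC of G1 with one SCC of G2.
import numpy as np
from scipy.sparse.csgraph import connected_components

A1 = np.array([[1, 1, 0],
               [1, 0, 1],
               [0, 0, 1]])   # SCCs of G1: {0, 1} and {2}
A2 = np.array([[1, 1],
               [0, 1]])      # SCCs of G2: {0} and {1}
A = np.kron(A1, A2)
n2 = A2.shape[0]

_, lab1 = connected_components(A1, directed=True, connection='strong')
_, lab2 = connected_components(A2, directed=True, connection='strong')
k, lab = connected_components(A, directed=True, connection='strong')

for c in range(k):
    verts = [v for v in range(A.shape[0]) if lab[v] == c]
    # vertex v of the product encodes the pair (v // n2, v % n2)
    pairs = {(lab1[v // n2], lab2[v % n2]) for v in verts}
    assert len(pairs) == 1   # one factor-SCC pair per product SCC
print("every product SCC comes from a single pair of factor SCCs")
```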
We are now ready to state our main technical result regarding the expected mixing time of a Markov Chain whose transition probability matrix is a Kronecker product of two stochastic matrices.
Lemma 3 Let $P$ be the transition matrix of a random walk on a graph with at least one closed strongly connected component, and assume all its closed strongly connected components are aperiodic. Also, let $L$ be the maximum expected coupling time of a random walk in a closed strongly connected component of $P$, and let $H$ be the maximum expected time for a random walk, starting at an arbitrary node, to get absorbed into a closed strongly connected component. Then, for $k \geq 4(L + H)\log(1/\varepsilon)$, the belief system described in the main text is within $\varepsilon$ of its limiting set of beliefs.

Proof 2 We use the coupling method to bound the convergence time of the belief system (ref. 5). Initially, we show that all opinions $x_k^i$, such that $i$ lies in a closed strongly connected component, converge to some stationary point. Thus, in what follows we find the time required to reach an $\varepsilon$-consensus via coupling arguments, which in turn provides the time required for a belief system to be $\varepsilon$-close to its stationary distribution.
Let $i$ be a node belonging to a closed strongly connected component $S$ and let $P_S$ be the matrix obtained by restricting $P$ to the entries in $S$. If $S$ is closed then $P_S$ is row-stochastic, and Perron-Frobenius theory tells us there exists some vector $\pi_S$ such that $\pi_S P_S = \pi_S$. Now, define two independent random walks $X = (X_k)_{k=0}^{\infty}$ and $Y = (Y_k)_{k=0}^{\infty}$ with the same transition matrix $P_S$, where $X$ starts from the distribution $\pi_S$ and $Y$ from some other arbitrary stochastic vector $v$. Moreover, couple the processes $Y$ and $X$ by defining a new process $W$ such that
$$W_k = \begin{cases} Y_k & \text{if } k < K, \\ X_k & \text{if } k \geq K, \end{cases} \qquad \text{where } K = \min\{k \geq 0 : X_k = Y_k\}.$$
Each random walk moves according to $P_S$, so if we correlate them by moving them together after they intersect, we have not changed the fact that, individually, they move according to $P_S$. With this construction of the coupling (ref. 6, Theorem 5.2), we have that
$$\| v P_S^k - \pi_S \|_{TV} \leq \Pr(K > k).$$
Therefore, by Markov's inequality $\Pr(K > k) \leq \mathbb{E}[K]/k$, so to be at a total variation distance of at most $1/4$ we require $k = 4\max_v \mathbb{E}[K]$. We say the mixing time of the random walk is $4L$, where $L = \max_v \mathbb{E}[K]$ is the maximum expected time it takes for the random walks $X$ and $Y$ in $S$ to intersect. Then, it follows that in order to be $\varepsilon$-close to the stationary distribution we require at least $k \geq 4L\log(1/\varepsilon)$ steps (ref. 6, Eq. 4.36), for any $v$. Therefore, we have shown that $x_k^i$, for $i$ in a closed strongly connected component $S$, converges to $\pi_S x_0^S$ at a geometric rate. Here $x_0^S$ stacks those $x_0^i$ that belong to $S$.
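The limit $\pi_S x_0^S$ inside a closed component can be sketched numerically: under repeated averaging $x_{k+1} = P_S x_k$, all opinions in $S$ converge to the same $\pi_S$-weighted average of the initial opinions. The matrix and initial opinions below are toy values chosen for illustration:

```python
# Sketch of the limit in Proof 2: for an aperiodic, strongly connected
# closed component with row-stochastic P_S, repeated averaging drives
# every opinion x_k^i to the consensus value pi_S @ x_0.
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])   # row-stochastic, irreducible, aperiodic

# stationary distribution: left eigenvector of P for eigenvalue 1
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

x0 = np.array([1.0, 0.0, 0.5])   # initial opinions x_0
x = x0.copy()
for _ in range(200):             # x_{k+1} = P x_k
    x = P @ x

assert np.allclose(x, (pi @ x0) * np.ones(3))
print("opinions converge to pi_S x_0 =", x[0])
```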
Now, consider a node in an open strongly connected component $M$. The update of the opinions of the nodes in $M$ can be written as
$$x_{k+1}^M = Z x_k^M + R\, y_k,$$
where $Z$ is strongly connected and substochastic, meaning some rows add up to less than $1$. The entries of $y_k$ come from nodes in other strongly connected components, and the matrix $R$ represents how they influence the nodes in $M$. Initially, assume that $y_k$ converges and call its limit $y_{\infty}$. Now, consider a random walk that moves around $M$ according to $Z$; the moment it steps out of $M$ into another strongly connected component we say it is absorbed by it, since it cannot return to $M$.
Let $q_k^i$ be the probability that the walk is at state $i \in M$ at time $k$, so that $q_{k+1} = q_k Z$. Let $H_i$ be the expected time for the walk, starting from node $i$, to get absorbed into any other strongly connected component that the set of nodes in $M$ is connected to, and let $H_1 = \max_{i \in M} H_i$. If the absorbing strongly connected component is closed, then $H = H_1$. On the other hand, if the absorbing strongly connected component is open, it has some absorbing time of its own, $H_2$, i.e., the time to get absorbed into yet another strongly connected component. Thus, the total absorbing time $H$ is the sum of the absorbing times of the strongly connected components on the longest path in the condensation of the graph $G$ from an open strongly connected component to a closed strongly connected component. The condensation of the graph $G$ is a directed acyclic graph, so such a path always exists.
By Markov's inequality, regardless of where the random walk starts, the probability that it takes more than $4H$ iterations to get absorbed is at most $1/4$. Thus, for all $k \geq 4H\log(1/\varepsilon)$ steps we have that $\|q_k\|_1 < \varepsilon$. Now, let $z_{\infty}$ be the vector that satisfies
$$z_{\infty} = Z z_{\infty} + R\, y_{\infty},$$
which we know exists since every eigenvalue of $Z$ must be strictly less than $1$ in modulus (since $Z^k \to 0$), so that $z_{\infty} = (I - Z)^{-1} R\, y_{\infty}$. If we define $\Delta_k = x_k^M - z_{\infty}$, then subtracting the updates of $x_k^M$ and $z_{\infty}$ gives
$$\Delta_{k+1} = Z \Delta_k + R\,(y_k - y_{\infty}).$$
It follows that $\Delta_k$ goes to zero, since we have assumed that $y_k \to y_{\infty}$, and $Z^k \to 0$.
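The fixed-point argument for an open component can be sketched numerically: iterating the update with a converged external input drives the opinions in $M$ to the solution of the linear system above. The matrices below are toy values chosen for illustration:

```python
# Sketch of the fixed point in Proof 2 for an open component M: with
# substochastic Z and external input y_inf, the opinions in M converge
# to the solution z_inf of z = Z z + R y, i.e. z_inf = (I - Z)^{-1} R y_inf.
import numpy as np

Z = np.array([[0.5, 0.3],
              [0.2, 0.4]])          # substochastic: rows sum to < 1
R = np.array([[0.2],
              [0.4]])               # influence of the rest of the graph
y_inf = np.array([0.7])             # limit of the external opinions

z_inf = np.linalg.solve(np.eye(2) - Z, R @ y_inf)

x = np.zeros(2)
for _ in range(500):                # x_{k+1} = Z x_k + R y_inf
    x = Z @ x + R @ y_inf
assert np.allclose(x, z_inf)
print("x_k in the open component converges to", z_inf)
```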
In conclusion, this argument shows that for all k ≥ 4(L + H) log(1/ε) steps every node is within ε of its limiting value.
The next lemma states the relation between the coupling and absorbing times for random walks on product graphs. Specifically, it shows a maximum-type behavior, where the coupling and absorbing times of the product system are determined by the maximum of the coupling and absorbing times of the factors.
Lemma 4 Consider two aperiodic strongly connected directed graphs $G_1$ and $G_2$. The expected coupling time of two random walks on the graph $G_1 \otimes G_2$ is $L = 8\max\{L_1, L_2\}$, where $L_1$ and $L_2$ are the expected coupling times for random walks on the graphs $G_1$ and $G_2$, respectively. Similarly, a random walk on an open strongly connected component of a graph $G_1 \otimes G_2$ has an expected absorbing time (into another strongly connected component) of $H = 8\max\{H_1, H_2\}$, where $H_1$ and $H_2$ are the expected absorbing times for random walks on the graphs $G_1$ and $G_2$, respectively.

Proof 3 Since both graphs $G_1$ and $G_2$ are aperiodic and strongly connected, their product is also aperiodic and strongly connected (Theorem 1), and there exists a limiting distribution $\pi$ for a random walk moving on the Kronecker product graph $G_1 \otimes G_2$.
Consider a random walk $X = (X_k)_{k=0}^{\infty}$ on the graph $G_1 \otimes G_2$, with transition matrix $A_1 \otimes A_2$, starting with some arbitrary distribution $v$, where $A_1$ is the transition probability matrix of a random walk on the graph $G_1$ and $A_2$ is the transition probability matrix of a random walk on the graph $G_2$. Moreover, from the definition of the Kronecker product of graphs, the state space of $G_1 \otimes G_2$ is the Cartesian product $V = V_1 \times V_2$, composed of the ordered pairs $(i, j)$ for $i \in V_1$ and $j \in V_2$. Thus, the probability that the random walk $X$ jumps from the node $(i, j)$ to the node $(\bar i, \bar j)$ is
$$\Pr\big((i, j) \to (\bar i, \bar j)\big) = A_1(i, \bar i)\, A_2(j, \bar j).$$
Following the coupling method, define another random walk $Y = (Y_k)_{k=0}^{\infty}$ with the same transition matrix $A_1 \otimes A_2$ but starting at the stationary distribution $\pi$, and construct a new coupled walk in which the two walks move together once they meet. Clearly, if the state of the random walk $X$ at time $k$ is $X_k = (i_k, j_k)$ and the state of the random walk $Y$ at time $k$ is $Y_k = (\bar i_k, \bar j_k)$, then the condition $Y_k = X_k$ implies that $i_k = \bar i_k$ and $j_k = \bar j_k$. Thus, the coupling time $K$ can alternatively be expressed in terms of the two separate conditions $i_k = \bar i_k$ and $j_k = \bar j_k$, which in turn are the coupling conditions for two separate random walks on each individual coordinate, where each coordinate represents one of the factor graphs. Therefore, we write the coupling time between the random walks $X$ and $Y$ as
$$K = \min\{k \geq 0 : Y_k = X_k\} = \min\{k \geq 0 : i_k = \bar i_k,\ j_k = \bar j_k\},$$
which is equivalent to $K = \max\{K_1, K_2\}$, where $K_1$ and $K_2$ are the coupling times for the graphs $G_1$ and $G_2$, respectively. Thus,
$$\Pr(K > k) = \Pr(\max\{K_1, K_2\} > k) \leq \Pr(K_1 > k) + \Pr(K_2 > k),$$
where the last inequality follows from the union bound.
Note that, given that the initial state of the random walk $X$ is drawn from $v$, the random walks on each of its coordinates have well defined initial distributions, $v_1(i) = \sum_{j \in V_2} v((i, j))$ and $v_2(j) = \sum_{i \in V_1} v((i, j))$, where $v_1(i)$ is the probability of starting in node $i \in V_1$, $v_2(j)$ is the probability of starting in node $j \in V_2$, and $v((i, j))$ is the probability that the random walk $X$ starts in the node $(i, j)$.
It follows from Theorem 5.2 in Levin et al. (ref. 6) that
$$\| v (A_1 \otimes A_2)^k - \pi \|_{TV} \leq \Pr(K > k) \leq \Pr(K_1 > k) + \Pr(K_2 > k) \leq \frac{\mathbb{E}[K_1] + \mathbb{E}[K_2]}{k} \leq \frac{2\max\{L_1, L_2\}}{k}.$$
Thus, in order to be at a distance of at most $1/4$ from the stationary distribution we require $k \geq 8\max\{L_1, L_2\}$. Moreover, in order to be $\varepsilon$-close to the stationary distribution we require at least $k \geq 8\max\{L_1, L_2\}\log(1/\varepsilon)$ steps of the random walk, for any initial state $v$. Finally, the coupling time of $X$ is $L = O(\max\{L_1, L_2\})$.
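A consequence of the product structure used throughout this proof is that the stationary distribution of the product walk factorizes as $\pi = \pi_1 \otimes \pi_2$ (by the mixed-product identity), and the total variation distance from stationarity indeed decays geometrically. A small numerical sketch with toy stochastic matrices:

```python
# The stationary distribution of a random walk on G1 x G2 factorizes:
# if pi_1 A_1 = pi_1 and pi_2 A_2 = pi_2, then pi_1 x pi_2 is stationary
# for A_1 x A_2, since (pi_1 x pi_2)(A_1 x A_2) = (pi_1 A_1) x (pi_2 A_2).
import numpy as np

A1 = np.array([[0.6, 0.4],
               [0.3, 0.7]])          # aperiodic, strongly connected
A2 = np.array([[0.1, 0.9, 0.0],
               [0.0, 0.2, 0.8],
               [0.5, 0.0, 0.5]])     # aperiodic, strongly connected

def stationary(A):
    """Left Perron eigenvector of a stochastic matrix, normalized."""
    w, V = np.linalg.eig(A.T)
    p = np.real(V[:, np.argmin(np.abs(w - 1))])
    return p / p.sum()

pi1, pi2 = stationary(A1), stationary(A2)
pi = np.kron(pi1, pi2)
assert np.allclose(pi @ np.kron(A1, A2), pi)

# TV distance from stationarity decays geometrically from any start v
v = np.zeros(6)
v[0] = 1.0
d = 0.5 * np.abs(v @ np.linalg.matrix_power(np.kron(A1, A2), 20) - pi).sum()
assert d < 1e-2
print("pi_1 x pi_2 is stationary for A_1 x A_2; TV distance at k=20:", d)
```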