Comments:

The paper is about a scalable alternative to the $k$-order Markov process, called $k$-LAMP. $k$-LAMP needs only $nnz(T)+k$ parameters (where $T$ is the transition matrix and $nnz(T)$ is the number of nonzero entries in $T$), while a $k$-order Markov process needs as many parameters as the number of paths of length $k+1$ in $T$. A generalized version called $k$-GLAMP is also suggested; it needs $k \cdot nnz(T)+k$ parameters. An experimental comparison to the Markov process and to an LSTM (Long Short-Term Memory network) seems convincing.

### Typos:
- page 5, Theorem 11: "\n this version.)"
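To make the parameter counts concrete, here is a minimal sketch (my own illustration, not code from the paper) for a hypothetical $4\times 4$ transition-support matrix. Assuming "paths of length $k+1$" means state sequences of $k+1$ states (i.e. walks using $k$ transitions in the support graph of $T$), it compares that count, obtained from powers of the 0/1 support matrix, against $nnz(T)+k$ and $k \cdot nnz(T)+k$. The values of `N`, `K`, and the support matrix are arbitrary toy choices.

```c
#include <stdio.h>

#define N 4   /* number of states (toy example) */
#define K 3   /* Markov order k (toy example)   */

typedef unsigned long long u64;

/* C = A * B for N x N matrices of walk counts. */
static void matmul(u64 A[N][N], u64 B[N][N], u64 C[N][N]) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            u64 s = 0;
            for (int l = 0; l < N; l++)
                s += A[i][l] * B[l][j];
            C[i][j] = s;
        }
}

int main(void) {
    /* Hypothetical 0/1 support of a transition matrix T (1 = nonzero entry). */
    u64 support[N][N] = {
        {0, 1, 1, 0},
        {0, 0, 1, 1},
        {1, 0, 0, 1},
        {1, 0, 0, 0},
    };

    u64 nnz = 0;
    u64 walk[N][N], tmp[N][N];
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            nnz += support[i][j];
            walk[i][j] = support[i][j];  /* walks using exactly 1 transition */
        }

    /* After this loop, walk[i][j] counts walks from i to j using exactly K
       transitions, i.e. state sequences of K+1 states. */
    for (int p = 1; p < K; p++) {
        matmul(walk, support, tmp);
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                walk[i][j] = tmp[i][j];
    }

    u64 markov_params = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            markov_params += walk[i][j];

    printf("k-order Markov (length-(k+1) paths): %llu\n", markov_params);
    printf("k-LAMP   nnz(T)+k:                   %llu\n", nnz + K);
    printf("k-GLAMP  k*nnz(T)+k:                 %llu\n", (u64)K * nnz + K);
    return 0;
}
```

Scaling the toy sizes up (larger `N`, `K`, or a denser support) makes the gap between the first count and the other two grow quickly, which is the scalability point at issue.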
Comments:

### Arboricity is different from degeneracy:
- "A classic bound on the degeneracy asserts that $\alpha(G)\leq \sqrt{2m}$ (Lemma 1 of [12])." [12] is about the arboricity, not the degeneracy! The degeneracy $\alpha$ is the maximum over all subgraphs of the minimum degree of the subgraph, while the arboricity $\beta$ is the minimum number of forests needed to cover all edges of the graph. [12] proves $\beta \leq \lceil \sqrt{2m+n}/2 \rceil$, and the two quantities satisfy $\beta \leq \alpha \leq 2\beta - 1$.
- "By a theorem of Chiba-Nishizeki [12], we can argue that the number of vertices in any set of the Turan shadow is at most $\sqrt{2m}$ (where $m$ is the number of edges in G)." We can actually state that the number of vertices in any set of the Turan shadow is at most $\alpha$, the degeneracy of the input graph (in large, sparse real-world graphs we generally have $\alpha \ll \sqrt{2m}$); see the degeneracy sketch below.

### Tight bound on the shadow size:
"Indeed, we can design instances where Claim 5.7 is tight (a set of $n/\alpha$ Erdos-Renyi graphs $G(\alpha,1/3)$)." It is not clear why this bound is tight: $\alpha$ is the degeneracy of the input graph, and a disjoint union of $n/\alpha$ Erdos-Renyi graphs $G(\alpha,1/3)$ may not have degeneracy $\alpha$.

### Scalability of an enumeration procedure:
"The primary challenge is combinatorial explosion. An autonomous system network with ten million edges has more than a trillion 10-cliques. Any enumeration procedure is doomed to failure." The fact that there can be "a trillion 10-cliques" may not be enough to discard an enumeration procedure. For instance, the following C program, which counts from 1 to one trillion, terminates within a few minutes on a commodity machine.
```c
int main(void) {
    unsigned long long i = 0;
    while (i < 1000000000000ULL) { i++; }  /* one trillion increments */
}
```

### Minors:
- "Denote the number of k-cliques by $C$, then the success probability is $C/{n \choose k}$. Thus, we can estimate this probability using ${n \choose k}/C$ samples." This may need some explanation.

### Typos:
- "It will be convenient to express this result in terms on k-clique densities." ("in terms on" should be "in terms of")
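As a concrete illustration of the degeneracy discussed above, here is a minimal sketch (my own illustration on a hypothetical toy graph, not code from the paper) that computes $\alpha$ by repeatedly removing a minimum-degree vertex and compares it to $\sqrt{2m}$. The edge list and `N` are arbitrary toy choices.

```c
#include <math.h>
#include <stdio.h>

#define N 6   /* number of vertices (toy example) */

int main(void) {
    /* Hypothetical toy graph: two triangles {0,1,2} and {3,4,5} joined by edge 2-3. */
    int edges[][2] = { {0,1}, {0,2}, {1,2}, {2,3}, {3,4}, {3,5}, {4,5} };
    int m = (int)(sizeof(edges) / sizeof(edges[0]));

    int adj[N][N] = {{0}};
    int deg[N] = {0};
    int removed[N] = {0};
    for (int e = 0; e < m; e++) {
        int u = edges[e][0], v = edges[e][1];
        adj[u][v] = adj[v][u] = 1;
        deg[u]++; deg[v]++;
    }

    /* Degeneracy = maximum, over the peeling order, of the degree of the
       minimum-degree vertex at the moment it is removed. */
    int degeneracy = 0;
    for (int step = 0; step < N; step++) {
        int best = -1;
        for (int v = 0; v < N; v++)
            if (!removed[v] && (best == -1 || deg[v] < deg[best]))
                best = v;
        if (deg[best] > degeneracy)
            degeneracy = deg[best];
        removed[best] = 1;
        for (int v = 0; v < N; v++)
            if (!removed[v] && adj[best][v])
                deg[v]--;
    }

    printf("degeneracy alpha = %d, sqrt(2m) = %.2f\n", degeneracy, sqrt(2.0 * m));
    return 0;
}
```

On the toy graph it reports $\alpha = 2$ versus $\sqrt{2m} = \sqrt{14} \approx 3.74$; with a bucket queue the same peeling runs in $O(n+m)$ time, so computing $\alpha$ is cheap even on large sparse graphs.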