diff --git a/graphE/.#napoleon.py b/graphE/.#napoleon.py
deleted file mode 120000
index ed80820..0000000
--- a/graphE/.#napoleon.py
+++ /dev/null
@@ -1 +0,0 @@
-jet08013@woodmont.27750:1530263946
\ No newline at end of file
diff --git a/graphE/graphE.tex b/graphE/graphE.tex
index 080995d..f44e7ae 100644
--- a/graphE/graphE.tex
+++ b/graphE/graphE.tex
@@ -169,13 +169,18 @@
   (For reference the adjacency matrix of the graph has about 6500 numbers).
   Applying TSNE to the result we get the following picture, colored by
   the conference that the teams belong to.

-  \includegraphics[width=4in]{football_clusters.png}
+  \includegraphics[width=4in]{football_clusters.png}
 \end{frame}
-
-
-
-
+\begin{frame}{Remarks for further study}
+  \begin{block}{Optimizing the computation}
+    Computing the softmax function is very inefficient because the normalization step requires a sum
+    over all of the nodes of the graph. \textit{Negative Sampling} and \textit{Hierarchical Softmax} are two approaches to removing this bottleneck.
+  \end{block}
+  \begin{block}{Modifying the algorithm and adjusting the hyperparameters}
+    There are many ways to vary how the random walks are carried out: one can adjust the length of each walk and the number of nodes sampled within it, or change how the walk itself is constructed (for example, by weighting the edges). The program \texttt{node2vec} offers many options of this type.
+  \end{block}
+\end{frame}
 \end{document}

 %%% Local Variables:
diff --git a/jax_gl/README.md b/jax_gl/README.md
new file mode 100644
index 0000000..ed90f12
--- /dev/null
+++ b/jax_gl/README.md
@@ -0,0 +1,3 @@
+This work is obsolete and superseded by **graphE**.
+
+
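The new "Remarks for further study" frame above mentions negative sampling as a way around the softmax bottleneck, where normalization requires a sum over every node. As a rough illustration only (not code from graphE), here is a minimal NumPy sketch of one skip-gram update with negative sampling for node embeddings; all names, dimensions, and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch of skip-gram with negative sampling for node embeddings.
# num_nodes, embed_dim, num_negatives, and the function name are illustrative.
import numpy as np

rng = np.random.default_rng(0)

num_nodes, embed_dim, num_negatives = 115, 32, 5
# Two embedding tables, as in word2vec: one for the "center" node, one for context.
center_vecs = 0.01 * rng.standard_normal((num_nodes, embed_dim))
context_vecs = 0.01 * rng.standard_normal((num_nodes, embed_dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def negative_sampling_step(center, context, lr=0.025):
    """One SGD step on a (center, context) pair drawn from a random walk.

    Instead of normalizing over all nodes (the softmax bottleneck), the true
    pair is pushed together and `num_negatives` randomly sampled nodes are
    pushed apart.
    """
    negatives = rng.integers(0, num_nodes, size=num_negatives)
    v_c = center_vecs[center]

    # Positive example: increase log sigmoid(u_o . v_c)
    u_o = context_vecs[context]
    score = sigmoid(u_o @ v_c)
    grad_vc = (score - 1.0) * u_o
    context_vecs[context] -= lr * (score - 1.0) * v_c

    # Negative examples: decrease sigmoid(u_k . v_c) for each sampled node k
    for k in negatives:
        u_k = context_vecs[k]
        score_k = sigmoid(u_k @ v_c)
        grad_vc += score_k * u_k
        context_vecs[k] -= lr * score_k * v_c

    center_vecs[center] -= lr * grad_vc

# Example: one update on an illustrative co-occurring pair of nodes.
negative_sampling_step(center=0, context=3)
```

Each update touches only the positive pair and a handful of sampled nodes, so its cost is independent of the number of nodes in the graph.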
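The same frame points to node2vec's options for modifying the random walk. The following is a hedged sketch, under the assumptions that the graph is unweighted and stored as a plain neighbor dict, of the second-order biased walk with return parameter p and in-out parameter q described in the node2vec paper; the function name and data layout are hypothetical, not taken from this repository.

```python
# Hypothetical sketch of a node2vec-style biased random walk.
# `adjacency` is an illustrative dict mapping node -> list of neighbors.
import random

def biased_walk(adjacency, start, walk_length, p=1.0, q=1.0):
    """Generate one second-order random walk starting at `start`.

    Given the previous node `prev` and current node `curr`, the unnormalized
    probability of stepping to a neighbor `nxt` is weighted by 1/p (return to
    prev), 1 (neighbor of prev), or 1/q (move farther from prev).
    """
    walk = [start]
    while len(walk) < walk_length:
        curr = walk[-1]
        neighbors = adjacency.get(curr, [])
        if not neighbors:
            break                                 # dead end: stop the walk
        if len(walk) == 1:
            walk.append(random.choice(neighbors)) # first step is uniform
            continue
        prev = walk[-2]
        weights = []
        for nxt in neighbors:
            if nxt == prev:
                weights.append(1.0 / p)           # return to the previous node
            elif nxt in adjacency.get(prev, []):
                weights.append(1.0)               # stay close to the previous node
            else:
                weights.append(1.0 / q)           # explore away from it
        walk.append(random.choices(neighbors, weights=weights, k=1)[0])
    return walk
```

Setting p and q away from 1 interpolates between more breadth-first (community-preserving) and more depth-first (structure-exploring) walks, which is one of the hyperparameter choices the slide alludes to.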