Graphcore Intelligence Processing Units (IPUs) have demonstrated standout performance running a wide range of GNNs. Find out why GNNs work so well on IPUs.
The Heterogeneous Graph Transformer (HGT) applies attention over the features of each node and edge type in a heterogeneous graph, rather than over the tokens of a sentence or the pixels of an image or video as in traditional transformers.
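To make the idea of type-aware attention concrete, here is a minimal NumPy sketch in the spirit of HGT: each relation type gets its own key/query projections, and attention weights are computed per relation. All names, shapes, and weights below are illustrative assumptions, not PyG's actual `HGTConv` implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Two node types with separate feature matrices (toy data).
x = {
    "author": rng.normal(size=(3, dim)),   # 3 author nodes
    "paper":  rng.normal(size=(5, dim)),   # 5 paper nodes
}

# One relation type: author -> writes -> paper, as (src, dst) index pairs.
edges = {("author", "writes", "paper"): [(0, 1), (1, 1), (2, 4)]}

# Hypothetical per-relation key/query projection matrices.
w_k = {rel: rng.normal(size=(dim, dim)) for rel in edges}
w_q = {rel: rng.normal(size=(dim, dim)) for rel in edges}

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

# Scaled dot-product attention per relation type:
# the query comes from the target node, the key from the source node.
for rel, pairs in edges.items():
    src_t, _, dst_t = rel
    scores = np.array([
        (x[dst_t][d] @ w_q[rel]) @ (x[src_t][s] @ w_k[rel]) / np.sqrt(dim)
        for s, d in pairs
    ])
    attn = softmax(scores)  # attention weights over this relation's edges
    print(rel, np.round(attn, 3))
```

In the full HGT model the projections are also conditioned on the node types and combined across multiple heads; this sketch only shows the core idea of typing the attention parameters by relation.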
We are excited to announce the release of PyG 2.2 🎉🎉🎉
PyG 2.2 is the culmination of work from 78 contributors who have worked on features and bug fixes for a total of over 320 commits since torch-geometric==2.1.0.
GraphGym is a platform for designing and evaluating Graph Neural Networks (GNNs), as originally proposed in the “Design Space for Graph Neural Networks” paper. We now officially support GraphGym as part of PyG.
PyG (PyTorch Geometric) has been moved from the personal account rusty1s to its own organization account pyg-team to emphasize the ongoing collaboration between TU Dortmund University, Stanford University, and many great external contributors. With this, we are releasing PyG 2.0, a new major release that brings sophisticated heterogeneous graph support, GraphGym, and many other exciting features to PyG.
We covered the release in our last town hall meeting. We’ll be posting all recorded events on the PyG YouTube channel going forward.
Our second sprint, which took place in December, focused on explainability. We had 18 contributors submitting over 30 commits and contributing to complex projects.
PyG core team members Rex Ying and Jiaxuan You presented the latest research in AutoML. There are millions of potential GNN designs for solving any given problem (e.g., architecture design, hyperparameters and hidden dimensions, learning configurations), so picking the best design for your specific problem can be challenging. GraphGym, a design space for managing GNN experiments, makes it easy to run a set of experiments, capture results, reproduce them, and swap in different models or datasets simply by modifying a config file.
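As an illustration of the config-driven workflow, a GraphGym experiment is described by a small YAML file. The exact field names below are a hedged sketch loosely modeled on GraphGym's configuration format; consult the GraphGym documentation for the authoritative schema.

```yaml
# Illustrative GraphGym-style experiment config (field names are assumptions).
dataset:
  name: Cora          # which dataset to load
  task: node          # node-level prediction task
gnn:
  layer_type: gcnconv # message-passing layer to use
  layers_mp: 2        # number of message-passing layers
  dim_inner: 64       # hidden dimension
optim:
  optimizer: adam
  max_epoch: 100
```

Changing the model architecture or dataset then amounts to editing a line or two of this file rather than touching training code, which is what makes sweeping over large design spaces tractable.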
We are running regular community sprints to get our community more involved in building with PyG. Whether you are just beginning to use graph learning or have been leveraging GNNs in research or production, the community sprints welcome members of all levels with different types of projects.
This September, we had our annual graph learning workshop hosted by Stanford. We had a terrific lineup of speakers: the PyG core team showcased the latest release and newest functionality, leading researchers presented the most advanced model architectures in their various fields, and industry leaders in finance, pharma, biomedicine, social media, and hardware showcased the latest innovations in graph learning at their respective organizations. It’s clear that graph learning has a very meaningful role to play in some of the most important problems at scale. Learn more and see the recordings of all the talks on the main event page here.