Document view
Knowledge distillation on neural networks for evolving graphs

Authors
Antaris S., Rafailidis D., Girdzijauskas S.
Date
2021
Language
en
DOI
10.1007/s13278-021-00816-1
Keywords
Deep neural networks
Distillation
Dynamic graph
Evolving graphs
Graph representation
Graph representation learning
Knowledge distillation
Neural-networks
ON dynamics
Real-world
Student Modeling
Teacher models
Cell proliferation
Springer
Show full item record
Abstract
Graph representation learning on dynamic graphs has become an important task in several real-world applications, such as recommender systems and email spam detection. To efficiently capture the evolution of a graph, representation learning approaches employ deep neural networks with a large number of parameters to train. Due to the large model size, such approaches have high online inference latency and are therefore challenging to deploy in industrial settings with vast numbers of users/nodes. In this study, we propose DynGKD, a distillation strategy that transfers knowledge from a large teacher model to a small student model with low inference latency while achieving high prediction accuracy. We first study different distillation loss functions to separately train the student model with various types of information from the teacher model. In addition, we propose a hybrid distillation strategy for evolving graph representation learning that combines the teacher's different types of information. Our experiments on five publicly available datasets demonstrate the superiority of the proposed model over several baselines, with an average relative drop of 40.60% in RMSE on the link prediction task. Moreover, our DynGKD model achieves a compression ratio of 21:100, accelerating inference with a speed-up factor of ×30 compared with the teacher model. For reproduction purposes, we make our datasets and implementation publicly available at https://github.com/stefanosantaris/DynGKD. © 2021, The Author(s).
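The abstract describes training a compact student model with a loss that mixes a supervised task objective and a distillation term from a frozen teacher. The sketch below illustrates that general idea only; it is not the paper's exact DynGKD objective. All names (`hybrid_distillation_loss`, the choice of MSE for both terms, and the `alpha` mixing weight) are illustrative assumptions, with the distillation term matching student node embeddings to the teacher's.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two arrays."""
    return float(np.mean((np.asarray(a) - np.asarray(b)) ** 2))

def hybrid_distillation_loss(student_emb, teacher_emb,
                             student_pred, labels, alpha=0.5):
    """Illustrative hybrid loss: alpha * task loss + (1 - alpha) * distillation.

    - task loss: MSE between the student's link-prediction scores and labels
    - distillation: MSE between student and (frozen) teacher node embeddings
    This is a generic sketch of the technique, not DynGKD's published loss.
    """
    task_loss = mse(student_pred, labels)          # supervised signal
    distill_loss = mse(student_emb, teacher_emb)   # mimic the teacher
    return alpha * task_loss + (1.0 - alpha) * distill_loss

# Toy example: 4 nodes, teacher embeddings of dim 8,
# student embeddings slightly perturbed from the teacher's.
rng = np.random.default_rng(0)
teacher_emb = rng.normal(size=(4, 8))
student_emb = teacher_emb + 0.1 * rng.normal(size=(4, 8))
labels = np.array([1.0, 0.0, 1.0, 0.0])
student_pred = np.array([0.9, 0.1, 0.8, 0.2])

loss = hybrid_distillation_loss(student_emb, teacher_emb,
                                student_pred, labels, alpha=0.5)
```

Sweeping `alpha` trades off fidelity to the labels against fidelity to the teacher, which is the knob the hybrid strategy in the abstract tunes across the teacher's different types of information.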
URI
http://hdl.handle.net/11615/70642
Collections
  • Publications in journals, conferences, book chapters, etc. [19735]

Related documents

Showing items related by title, author, creator and subject.

  • Online graph exploration with advice

    Dobrev, S.; Královič, R.; Markou, E. (2012)
    We study the problem of exploring an unknown undirected graph with non-negative edge weights. Starting at a distinguished initial vertex s, an agent must visit every vertex of the graph and return to s. Upon visiting a ...
  • Different speeds suffice for rendezvous of two agents on arbitrary graphs

    Kranakis E., Krizanc D., Markou E., Pagourtzis A., Ramírez F. (2017)
    We consider the rendezvous problem for two robots on an arbitrary connected graph with n vertices and all its edges of length one. Two robots are initially located on two different vertices of the graph and can traverse ...
  • GRATIS: A GRaph tool for information systems scientists

    Vlachos V., Siklafidis T., Chantzi K. (2019)
    All technological, biological and social networks can be represented as graphs. Therefore, graphs are utilised in simulation-based studies of new algorithms and protocols in various scientific fields. This paper presents ...