Projet ANR CodeGNN




CodeGNN is a collaborative project supported by three research laboratories:

Laboratory | Supporting institution | Cities
GREYC | ENSICAEN | Caen
LITIS | Université de Rouen Normandie | Rouen and Le Havre
LIFAT | Polytech Tours | Tours

These three labs aim to investigate graph convolutional networks in order to overcome three important bottlenecks:

  1. Convolutions on graphs mainly correspond to low-pass filters. We plan to propose new convolution schemes extending the range of available/learnable frequencies, together with the items (subgraphs or vertices) on which these convolutions are applied.
  2. Few graph downsampling methods have been proposed so far. We plan to design new downsampling/pooling methods allowing a fixed decimation ratio, a better control of the extent of each cluster, and a preservation of the spectral properties of graphs.
  3. Designing a convolutional graph neural network is not an easy task. The problem becomes even more difficult when one considers sequences of graphs instead of single graph instances. We plan to investigate these objects in order to propose new schemes integrating temporal and spatial (graph) information.
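The first bottleneck above can be illustrated with a small numerical sketch. The code below (a toy path graph with a random signal, not project code) shows that the standard GCN propagation operator S = D^{-1/2}(A + I)D^{-1/2} behaves as a low-pass filter: since S = I - L̂ for the normalized Laplacian L̂ with eigenvalues in [0, 2], its spectral response (1 - λ)² never exceeds 1, so each application can only attenuate the high-frequency content of a graph signal.

```python
import numpy as np

# Toy path graph 0-1-2-3-4 (illustrative, not project code)
n = 5
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

A_hat = A + np.eye(n)                  # add self-loops
d = A_hat.sum(axis=1)
S = A_hat / np.sqrt(np.outer(d, d))    # D^{-1/2} (A + I) D^{-1/2}
L_hat = np.eye(n) - S                  # normalized Laplacian, eigenvalues in [0, 2]

def dirichlet_energy(x):
    """x^T L_hat x: measures the high-frequency content of signal x."""
    return float(x @ L_hat @ x)

rng = np.random.default_rng(0)
x = rng.standard_normal(n)
e0 = dirichlet_energy(x)
e1 = dirichlet_energy(S @ x)           # one GCN propagation step

# The filter response (1 - lambda)^2 <= 1 on [0, 2], so e1 <= e0:
# repeated propagation smooths the signal towards low frequencies.
print(e0, e1)
```

Designing filters that escape this low-pass behaviour is precisely what the new convolution schemes target.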

These three research lines come with three applications conducted in close collaboration with three external research laboratories:

  1. Investigating chemical properties using new convolution schemes and the electronic properties of atoms, in close collaboration with an external laboratory.
  2. Automatically learning the best clustering of the atoms of a molecule in order to predict its biological effect. We would thus like to refine, through a learning scheme, the pharmacophoric graphs used by biologists when designing a new drug.
  3. Analysing sequences of fMRI acquisitions and sequences of body positions using techniques derived from the analysis of graph sequences.


A large part of the recent advances in Artificial Intelligence may be connected to the statistical pattern recognition field. Using continuous optimisation, this research field defines a global numerical description of objects that is then combined with machine learning techniques. On the other hand, most of the objects of interest in our daily lives are discrete objects with sequential (strings) or more complex (graph) relationships: the relationships between people in social graphs, the bonds between atoms in a molecule, or the topographic distance between speed sensors in traffic analysis, to name a few.

The prediction of the properties of such objects falls within the scope of structural pattern recognition. For decades, this research field has been limited by costly (e.g. based on subgraph isomorphism) or inefficient metrics, usually combined with limited machine learning algorithms (mainly the k-nearest neighbours algorithm). A first important breakthrough was achieved by the introduction of structural kernels dedicated to strings or graphs. In addition to providing efficient metrics on these discrete objects, such kernels are a gateway towards many machine learning methods; hence, they reduce the gap between structural and statistical pattern recognition techniques.
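To make the idea of a structural kernel concrete, here is a minimal pure-Python sketch of one classic example, the Weisfeiler-Lehman subtree kernel (an illustration of the family, not one of the project's own methods): vertex labels are iteratively refined by their neighbourhoods, and the kernel accumulates the dot products of the resulting label histograms.

```python
from collections import Counter

def wl_relabel(adj, labels):
    """One Weisfeiler-Lehman iteration: each vertex label becomes
    (old label, sorted multiset of neighbour labels)."""
    return [
        (labels[v], tuple(sorted(labels[u] for u in adj[v])))
        for v in range(len(adj))
    ]

def wl_kernel(adj1, labels1, adj2, labels2, iterations=2):
    """WL subtree kernel: dot product of vertex-label histograms,
    accumulated over successive refinement iterations."""
    k = 0
    l1, l2 = list(labels1), list(labels2)
    for _ in range(iterations + 1):
        h1, h2 = Counter(l1), Counter(l2)
        k += sum(h1[lab] * h2[lab] for lab in h1)
        l1, l2 = wl_relabel(adj1, l1), wl_relabel(adj2, l2)
    return k

# Two toy labelled graphs given as adjacency lists + vertex labels
triangle = ([[1, 2], [0, 2], [0, 1]], ["C", "C", "O"])
path     = ([[1], [0, 2], [1]],       ["C", "C", "O"])
k = wl_kernel(*triangle, *path)
print(k)  # → 7
```

Because such a kernel is a valid inner product, it plugs directly into SVMs, kernel PCA, and the rest of the kernel-methods toolbox, which is exactly the gateway mentioned above.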

A second breakthrough in this field was provided by the introduction of Graph Neural Networks (GNNs). Like graph kernels, these networks provide a strong connection between graphs and machine learning techniques. Moreover, like other deep learning techniques, GNNs avoid handcrafting a similarity measure between graphs. GNNs are based on two operations: graph convolution and graph decimation/pooling. WP1 and WP2 are dedicated to overcoming current limitations of these two operations. WP1 will investigate the design of more selective convolution filters in the spectral domain, while WP2 will consider alternatives to clustering as a basis for graph decimation. The dynamics of time-evolving graphs constitute a key concern in many practical applications but are much less investigated. The point here is to perform predictions about graph sequences instead of single graphs.
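The two GNN operations just named can be sketched in a few lines of numpy (a toy 4-cycle with random weights, purely illustrative): one Kipf-and-Welling-style convolution layer, followed by the cluster-based decimation scheme that WP2 seeks alternatives to.

```python
import numpy as np

rng = np.random.default_rng(42)

def gcn_layer(A, X, W):
    """One graph-convolution layer (Kipf & Welling style):
    relu(D^{-1/2} (A + I) D^{-1/2} X W)."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    S = A_hat / np.sqrt(np.outer(d, d))
    return np.maximum(S @ X @ W, 0.0)

def coarsen(A, X, clusters):
    """Cluster-based decimation/pooling: vertices sharing a cluster id
    are merged into one super-vertex with mean-pooled features."""
    ids = sorted(set(clusters))
    # P[i, k] = 1 if vertex i belongs to cluster k
    P = np.array([[c == k for k in ids] for c in clusters], dtype=float)
    return P.T @ A @ P, (P.T @ X) / P.sum(axis=0)[:, None]

# 4-cycle with 2-d vertex features, pooled to 2 super-vertices
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 2))
W = rng.standard_normal((2, 3))

H = gcn_layer(A, X, W)                # (4, 3) vertex embeddings
A2, H2 = coarsen(A, H, [0, 0, 1, 1])  # (2, 2) graph, (2, 3) features
print(H.shape, A2.shape, H2.shape)
```

Note how the pooling step depends entirely on the chosen clustering; the fixed decimation ratio and spectral preservation targeted by WP2 are not guaranteed by this naive scheme.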

Applications concern, among others, the analysis of brain graph sequences, traffic forecasting, and action recognition. This point will be addressed by WP3, which will combine the findings of WP1 and WP2 while integrating the time dimension. In close collaboration with brain specialists, this work package will target the prediction of waking and falling-asleep phases in functional MRI. Such a prediction is fundamental for a correct interpretation of functional MRI sequences, where patients, and especially the elderly, can go through different levels of consciousness.


The project groups three French research laboratories: the GREYC (UMR 6072), the LITIS (EA 4108), and the LIFAT (EA 6300 – ERL CNRS 7002).

The members of the project are listed below:

Partner | Name | First name | Position | Role | Expertise in the project
GREYC | Brun | Luc | PR | Project coordinator, WP2 leader | Graph decimation
GREYC | Cuissart | Bertrand | MCF | | major: graph algorithmics; minor: aggregation of information
GREYC | Bougleux | Sébastien | MCF | | major: analysis of time-varying graphs
GREYC | Lechervy | Alexis | MCF | | major: machine learning; minor: analysis of time sequences
GREYC | Stanovic | Stevan | PhD student (GREYC/LITIS) | | major: graph reduction
LITIS | Adam | Sébastien | PR | Local coordinator (LITIS) | major: graph convolution; minor: graph neural networks for spatio-temporal data
LITIS | Héroux | Pierre | MCF | WP1 leader | major: graph convolution
LITIS | Chatelain | Clément | MCF | | major: recurrent neural networks
LITIS | Gaüzère | Benoît | MCF | | major: graph similarity measures; minor: graph convolutions
LITIS | Honeine | Paul | PR | | major: graph pre-image & autoencoders; minor: learning theory
LITIS | Yger | Florian | MCF* | | major: graph representation; minor: NN on Riemannian manifolds
LIFAT | Conte | Donatello | MCF | WP3 leader | major: graph sequence analysis; minor: GCNN
LIFAT | Ramel | Jean-Yves | PR | Local coordinator (LIFAT) | major: graph embedding; minor: graph-based representations
LIFAT | Raveaux | Romain | MCF | | major: GCNN in graph space
LIFAT | Ragot | Nicolas | MCF | | major: recurrent neural networks
