Thanks to my collaborators
Graphs have long been used in geography and GIScience
Graph Neural Networks (GNNs) were developed in machine learning
\[ h_{v}^{(l)} = \sigma \left( W^{(l)} \sum_{u \in N(v)} \frac{1}{|N(v)|} h_{u}^{(l-1)} \right) \]
\[ h_{v}^{(l)} = \sigma \left( W^{(l)} \ {\scriptstyle COMBINE} \left( h_{v}^{(l-1)}, {\scriptstyle AGGREGATE} \left( \bigl\{ h_{u}^{(l-1)}, \forall u \in N(v) \bigr\} \right) \right) \right) \]
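The mean-aggregation layer above can be sketched in a few lines of NumPy. This is an illustrative toy, not an implementation from the talk: `gcn_layer` and all variable names are assumptions, and the weight matrix is applied after aggregation with row-vector features (so `H_agg @ W` corresponds to \(W^{(l)}\) acting on each aggregated vector).

```python
import numpy as np

def gcn_layer(H, A, W, sigma=np.tanh):
    """One mean-aggregation GCN layer: for each node v, average the
    previous-layer features of its neighbours N(v), apply the layer
    weights, then the non-linearity sigma.

    H : (n, d_in) node features h^{(l-1)}
    A : (n, n) binary adjacency matrix (no self-loops)
    W : (d_in, d_out) layer weights W^{(l)}
    """
    deg = A.sum(axis=1, keepdims=True)      # |N(v)| for each node
    H_agg = (A @ H) / np.maximum(deg, 1)    # mean over neighbours
    return sigma(H_agg @ W)                 # h^{(l)}

# Tiny example: a 3-node path graph 0-1-2 with 2-d toy features
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.eye(3)[:, :2]
W = np.full((2, 2), 0.5)
H1 = gcn_layer(H, A, W)
```

Stacking several such layers lets information propagate beyond the immediate neighbourhood, one hop per layer.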
Unsupervised learning of node representations
The analysis of urban physical form or built form (Batty 2008)
Topics
Approaches
(De Sabbata, Ballatore, Liu, et al. 2023)
Pre-processing
Model
Leicester (UK)
| Measure | Node embeddings, first dim. | Node embeddings, second dim. | Ego-graph emb., first dim. | Ego-graph emb., second dim. |
|---|---|---|---|---|
| *Node in city* | | | | |
| closeness centrality | 0.262*** | -0.194*** | 0.365*** | -0.337*** |
| betweenness centrality | 0.242*** | -0.026*** | 0.117*** | -0.155*** |
| *Ego-graph* | | | | |
| count of nodes | -0.033*** | -0.104*** | -0.138*** | -0.226*** |
| count of edges | 0.013* | -0.101*** | -0.068*** | -0.213*** |
| average node degree | 0.261*** | 0.005 | 0.377*** | 0.037*** |
| total edge length | 0.210*** | -0.131*** | 0.208*** | -0.246*** |
| average edge length | 0.370*** | -0.045*** | 0.580*** | -0.022*** |
| average count of streets per node | 0.280*** | -0.232*** | 0.431*** | -0.421*** |
| count of intersections | 0.047*** | -0.144**** | -0.019*** | -0.302*** |
| total street segment length | 0.192*** | -0.163*** | 0.190*** | -0.315*** |
| count of street segments | 0.009 | -0.134*** | -0.070*** | -0.285*** |
| average street segment length | 0.365*** | -0.044*** | 0.589*** | -0.015* |
| average street circuity | -0.028*** | 0.131*** | -0.066*** | 0.225*** |
GNNs can be used as an unsupervised framework to explore urban form
Future work
Crucial tools in quantitative geography (Webber and Burrows 2018)
Can we automatically identify the two groups visible in the scatterplot, without any previous knowledge of the groups?
Methods:
\[ m'_i = \alpha m_i + \beta \frac{1}{A} \sum_{j=1}^{n} w_{ij} m_j \]
Intuition: is membership update akin to graph convolution?
\[ h_{v}^{(l)} = \sigma \left( W^{(l)} \ {\scriptstyle COMBINE} \left( h_{v}^{(l-1)}, {\scriptstyle AGGREGATE} \left( \bigl\{ h_{u}^{(l-1)}, \forall u \in N(v) \bigr\} \right) \right) \right) \]
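The structural parallel is that both rules combine a unit's own value with a normalised weighted sum over its neighbours. A minimal NumPy sketch of the membership update, assuming (my assumption, for illustration) that the normalising constant \(A\) is the sum of the spatial weights for each unit; `membership_update` and all names are illustrative:

```python
import numpy as np

def membership_update(m, w, alpha, beta):
    """Spatially smoothed membership update:
    m'_i = alpha * m_i + beta * (1/A) * sum_j w_ij * m_j,
    taking A as the row sum of the spatial weights.

    m : (n,) membership values
    w : (n, n) spatial weights w_ij
    """
    A = w.sum(axis=1)                          # per-unit normaliser
    return alpha * m + beta * (w @ m) / np.maximum(A, 1e-12)

# Example: path graph 0-1-2 with binary contiguity weights
w = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
m = np.array([1.0, 0.0, 0.0])
m_new = membership_update(m, w, alpha=0.5, beta=0.5)
```

Read against the GNN equation above, \(\alpha m_i\) plays the role of COMBINE's self term and the normalised weighted sum plays the role of AGGREGATE over \(N(v)\), but with fixed weights rather than learnt ones.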
Data
Evaluation framework
Our GNN framework has the potential to develop into a wide range of approaches
Graph Neural Networks hold great potential in urban analytics
Foundation models will be cornerstones of many future methods and studies
Dr Stef De Sabbata (they/them)
Associate Professor of Geographical Information Science at the School of Geography, Geology and the Environment
Research theme lead for Cultural Informatics at the Institute for Digital Culture
University of Leicester, University Road, Leicester, LE1 7RH, UK
Contact: s.desabbata@le.ac.uk
Check out my GitHub repos at: github.com/sdesabbata
Systematic search of the design space (You, Ying, and Leskovec 2020)
Graph attentional operator defined by Veličković et al. (2018)
\[ \mathbf{x}^{\prime}_i = \alpha_{i,i}\mathbf{\Theta}_{s}\mathbf{x}_{i} + \sum_{j \in N(i)} \alpha_{i,j}\mathbf{\Theta}_{t}\mathbf{x}_{j} \]
\[ \alpha_{i,j} = \frac{ \exp\left(\mathrm{LeakyReLU}\left( \mathbf{a}^{\top}_{s} \mathbf{\Theta}_{s}\mathbf{x}_i + \mathbf{a}^{\top}_{t} \mathbf{\Theta}_{t}\mathbf{x}_j \right)\right)} {\sum_{k \in N(i) \cup \{ i \}} \exp\left(\mathrm{LeakyReLU}\left( \mathbf{a}^{\top}_{s} \mathbf{\Theta}_{s}\mathbf{x}_i + \mathbf{a}^{\top}_{t}\mathbf{\Theta}_{t}\mathbf{x}_k \right)\right)} \]
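A single-head version of the attentional operator above can be sketched directly in NumPy; this is a toy illustration of the two equations, not the reference implementation, and `gat_layer` and all names are assumptions:

```python
import numpy as np

def leaky_relu(z, slope=0.2):
    return np.where(z > 0, z, slope * z)

def gat_layer(X, adj, Theta_s, Theta_t, a_s, a_t):
    """Single-head graph attentional layer in the form above.
    X: (n, d_in) features; adj: list of neighbour index lists;
    Theta_s, Theta_t: (d_in, d_out); a_s, a_t: (d_out,) attention vecs."""
    Xs = X @ Theta_s                   # Θ_s x for every node
    Xt = X @ Theta_t                   # Θ_t x for every node
    es = Xs @ a_s                      # a_s^T Θ_s x_i, per node
    et = Xt @ a_t                      # a_t^T Θ_t x_j, per node
    out = np.zeros_like(Xt)
    for i, nbrs in enumerate(adj):
        cand = list(nbrs) + [i]        # softmax over N(i) ∪ {i}
        logits = leaky_relu(es[i] + et[cand])
        alpha = np.exp(logits - logits.max())
        alpha /= alpha.sum()
        # x'_i = α_ii Θ_s x_i + Σ_{j∈N(i)} α_ij Θ_t x_j
        out[i] = alpha[-1] * Xs[i] + sum(
            a * Xt[j] for a, j in zip(alpha[:-1], nbrs))
    return out

# Two mutually adjacent nodes; zero attention vectors give uniform α
X = np.array([[1., 0.], [0., 1.]])
adj = [[1], [0]]
out = gat_layer(X, adj, np.eye(2), np.eye(2), np.zeros(2), np.zeros(2))
```

With zero attention vectors the softmax is uniform, so each node ends up averaging its own and its neighbour's transformed features; learning \(\mathbf{a}_s, \mathbf{a}_t\) is what lets the layer weight neighbours unequally.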
Graph isomorphism operator defined by Xu et al. (2019)
\[ \mathbf{x}^{\prime}_i = h_{\mathbf{\Theta}} \left( (1 + \epsilon) \cdot \mathbf{x}_i + \sum_{j \in N(i)} \mathbf{x}_j \right) \]
where \(h_{\mathbf{\Theta}}\) is a multi-layer perceptron (MLP)
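Because the aggregation is an unweighted sum plus a scaled self term, the whole layer reduces to one matrix expression. A minimal sketch (names are illustrative, and the identity function stands in for the MLP \(h_{\mathbf{\Theta}}\)):

```python
import numpy as np

def gin_layer(X, A, mlp, eps=0.0):
    """Graph isomorphism layer in the form above:
    x'_i = h_Θ((1 + ε) · x_i + Σ_{j∈N(i)} x_j).
    X: (n, d) features; A: (n, n) binary adjacency; mlp: callable h_Θ."""
    return mlp((1.0 + eps) * X + A @ X)

# Path graph 0-1-2, unit features, identity stand-in for h_Θ
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.ones((3, 1))
out = gin_layer(X, A, mlp=lambda Z: Z)
```

The sum (rather than mean) aggregation preserves neighbourhood sizes, which is central to the operator's discriminative power: here the degree-2 middle node ends up with a larger value than its degree-1 neighbours.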
Modified graph isomorphism operator defined by Hu et al. (2020a) to incorporate edge features
\[ \mathbf{x}^{\prime}_i = h_{\mathbf{\Theta}} \left( (1 + \epsilon) \cdot \mathbf{x}_i + \sum_{j \in N(i)} \mathrm{ReLU} ( \mathbf{x}_j + \mathbf{e}_{j,i} ) \right) \]
where \(h_{\mathbf{\Theta}}\) is a multi-layer perceptron (MLP)
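The edge-aware variant only changes the aggregation term, adding each incoming edge's features before the ReLU. A toy sketch under the same assumptions as before (`gine_layer` and all names are illustrative; identity stands in for \(h_{\mathbf{\Theta}}\)):

```python
import numpy as np

def gine_layer(X, edges, E, mlp, eps=0.0):
    """Edge-aware graph isomorphism layer in the form above:
    x'_i = h_Θ((1 + ε) · x_i + Σ_{j∈N(i)} ReLU(x_j + e_{j,i})).
    X: (n, d); edges: list of directed (j, i) pairs; E: (m, d) edge
    features aligned with `edges`; mlp: callable h_Θ."""
    agg = np.zeros_like(X)
    for (j, i), e in zip(edges, E):
        agg[i] += np.maximum(X[j] + e, 0.0)   # ReLU(x_j + e_{j,i})
    return mlp((1.0 + eps) * X + agg)

# Two nodes connected in both directions, zero edge features
X = np.array([[1., 2.], [3., 4.]])
edges = [(0, 1), (1, 0)]
E = np.zeros((2, 2))
out = gine_layer(X, edges, E, mlp=lambda Z: Z)
```

For street networks this is the natural choice of the three operators, since segment attributes such as length or road class can enter as the edge features \(\mathbf{e}_{j,i}\).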