Network
This paper aims to understand topological closeness (which is automatically normalized) and Wilson's framework, which introduces a full family of spatial interaction models of which the unconstrained model is just the start. Indeed, since then there have been any number of incremental advances and alternatives, such as relaxed models, Stillwell's origin/destination parameter-specific models and Dennett and Wilson's own multi-level model, to name just a few. In this session we will explore the rest of Wilson's family: the Production (origin) Constrained Model, the Attraction (destination) Constrained Model and the Doubly Constrained Model. We will see how we can, again, use a Poisson regression model in R to calibrate these models and how, once calibrated, we can use the models in different contexts, such as Land Use Transportation Interaction (LUTI) modelling.
Topological Network
We first need to go to http://extract.bbbike.org to extract the shapefile for the bit of street network that we need. We then need to break the line strings up into individual segments; a simple approach is to loop over the GeoDataFrame rows and then over each line string. There are 55 disconnected components, so let's just take the largest one. We create a weighted graph from the ldn_segs GeoDataFrame, although there seem to be some duplicated edges, so we use a multigraph. networkx is bad at drawing large graphs, so we will be using graphviz to make a layout; matplotlib finds this hard to draw, so expect it to take some time!
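A minimal sketch of these steps, assuming the raw street network has been read into a GeoDataFrame called ldn_lines (a hypothetical name; only the segment frame ldn_segs is named above):

import geopandas as gpd
import networkx as nx
from shapely.geometry import LineString

# Break each LineString into its individual straight segments
segments = []
for _, row in ldn_lines.iterrows():
    coords = list(row.geometry.coords)
    for a, b in zip(coords[:-1], coords[1:]):
        segments.append(LineString([a, b]))
ldn_segs = gpd.GeoDataFrame(geometry=segments, crs=ldn_lines.crs)

# Build a weighted multigraph: nodes are segment end points, edge weights are
# segment lengths; a MultiGraph is used because some edges are duplicated
G = nx.MultiGraph()
for seg in ldn_segs.geometry:
    start, end = seg.coords[0], seg.coords[-1]
    G.add_edge(start, end, weight=seg.length)

# Keep only the largest of the disconnected components
largest = max(nx.connected_components(G), key=len)
G = G.subgraph(largest).copy()

# A graphviz layout copes with large graphs better than networkx's own drawing
# pos = nx.nx_agraph.graphviz_layout(G, prog="sfdp")   # requires pygraphviz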
We now create some coordinates to use for plotting and convert the points GeoDataFrame into a dictionary of positions.
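For example, assuming a points GeoDataFrame (here called ldn_points, a hypothetical name) with an id column identifying each node:

# Map each node id to an (x, y) tuple that the drawing functions can use
pos = {row["id"]: (row.geometry.x, row.geometry.y) for _, row in ldn_points.iterrows()}

# e.g. nx.draw_networkx_nodes(G, pos, node_size=2)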
Centrality measures
The next task is to compute closeness centrality, but this takes too long: unfortunately networkx doesn't implement a cutoff the way igraph does, so we borrow clos_osm from python-igraph. As the nodes are labelled from 0, we can use these labels directly as node indices. We add the vertex x and y positions (multiplying y by -1 to flip for Cairo plotting) and finally add the edge weights. If we plot the network with a graph layout we won't get any sense of what it is, but because we added 'x' and 'y' coordinates to the vertices, igraph knows how to plot it spatially. We then get the RGBA values, throw away the alpha channel and map them to a list of tuples, and plot the degree distribution.
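A rough sketch of this hand-off to python-igraph, assuming the multigraph G from above carries x/y node attributes and a weight edge attribute (these are assumptions about how the data was prepared); the cutoff value is purely illustrative:

import igraph as ig
import matplotlib.cm as cm
import matplotlib.colors as mcolors
import networkx as nx

# Relabel nodes 0..n-1 so they can be used directly as igraph vertex indices
G_int = nx.convert_node_labels_to_integers(G)

edges, weights = [], []
for u, v, d in G_int.edges(data=True):
    edges.append((u, v))
    weights.append(d.get("weight", 1.0))

g = ig.Graph(n=G_int.number_of_nodes(), edges=edges)
g.es["weight"] = weights

# Vertex coordinates for spatial plotting; y is multiplied by -1 because Cairo
# draws from the top left rather than the bottom left
g.vs["x"] = [G_int.nodes[v].get("x", 0.0) for v in range(g.vcount())]
g.vs["y"] = [-G_int.nodes[v].get("y", 0.0) for v in range(g.vcount())]

# Closeness with a cutoff: only paths up to the cutoff are considered, which
# keeps the computation tractable on a large street network
clos = g.closeness(weights="weight", cutoff=2000)

# Colour vertices by closeness, dropping the alpha channel from the RGBA values
norm = mcolors.Normalize(vmin=min(clos), vmax=max(clos))
g.vs["color"] = [tuple(cm.viridis(norm(c))[:3]) for c in clos]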
Nodal removal
Nodes are removed according to a selected property, e.g. degree, closeness centrality or betweenness centrality, so that we can explore the effects on the network of removing such nodes; the nodes with the highest values should have the most impact on the network. First we add in some geographic locations for this plot and tidy up the nodes, multiplying the y-axis coordinates by -1 as Cairo plots from the top left, not the bottom left.
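A minimal sketch of this kind of experiment, assuming a simple (non-multi) graph; the number of nodes removed and the choice of impact measure (size of the largest connected component) are illustrative:

import networkx as nx

def remove_by(G, scores, n_remove=50):
    """Remove the n_remove highest-scoring nodes one at a time and record the
    size of the largest connected component after each removal."""
    H = G.copy()
    order = sorted(scores, key=scores.get, reverse=True)[:n_remove]
    sizes = []
    for node in order:
        H.remove_node(node)
        sizes.append(len(max(nx.connected_components(H), key=len)))
    return sizes

G_simple = nx.Graph(G)   # collapse any parallel edges before measuring centrality
sizes_deg = remove_by(G_simple, dict(G_simple.degree()))
sizes_btw = remove_by(G_simple, nx.betweenness_centrality(G_simple))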
Some pairs of stations appear multiple times, e.g. node 11, because of the different lines that pass through the station. To get all instances (and to deal with issues such as stray spaces) we use pandas, which handles this nicely with its string functions; anchoring the pattern means that 'Baker', for example, must occur at the beginning of the name.
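For example, assuming the station attributes live in a pandas DataFrame called stations with a name column (hypothetical names):

import pandas as pd

# Every row whose station name starts with 'Baker' (the '^' anchors the pattern
# to the beginning of the string), e.g. Baker Street on several different lines
baker = stations[stations["name"].str.contains(r"^Baker", regex=True)]

# The same selection without a regular expression
baker = stations[stations["name"].str.startswith("Baker")]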
We assign group ids to the groups found by the fast greedy modularity algorithm and assign spatial locations to the points, and the plot now looks like the original shapefile plot!
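A sketch of that step with python-igraph, assuming the tube network is held in an igraph graph g with a weight edge attribute (parallel edges have to be merged first, as the algorithm needs a simple graph):

import igraph as ig

g_simple = g.copy()
g_simple.simplify(combine_edges="sum")   # merge parallel edges, summing weights

# Fast greedy modularity optimisation; the dendrogram is cut at the level that
# maximises modularity and each vertex gets its group id
dendrogram = g_simple.community_fastgreedy(weights="weight")
clusters = dendrogram.as_clustering()
g_simple.vs["group"] = clusters.membership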
We compute betweenness. For multigraphs, networkx only computes the topological betweenness; this happens even if the weight parameter is set. Let's add the names of the stations to the graph. Our multigraph will have several pairs of stations connected by multiple lines, so we can simplify the graph to remove the duplicate edges. If we want to calculate weighted betweenness we must specify weight, as the default is None. Adding distances may change the shortest paths compared to the topological case.
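For instance, on a simplified copy of the multigraph (the graph name here is an assumption):

import networkx as nx

G_station = nx.Graph(G)   # drop the parallel edges between station pairs

# Topological betweenness: every edge counts as length 1 (weight defaults to None)
bc_topo = nx.betweenness_centrality(G_station)

# Distance-weighted betweenness: shortest paths now follow the 'weight' attribute,
# which may pick out different routes than the topological case
bc_dist = nx.betweenness_centrality(G_station, weight="weight")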
import matplotlib.pyplot as plt   # import needed for the figure below
f, ax = plt.subplots(figsize=(16, 10))
The stations represent the possible nodes. We have to cast the ids as strings, as otherwise they get assigned as integer node ids that won't match the edge list; we then take the edge list and add the edges, again as strings so that they match the node names. Finally we add the edge weights and make a basic plot (a sketch of these steps follows below). Now we want to extract the actual station locations from these data: we'd like the id and name of each station, and the starting point of the line representing the connection. Here's an approximation of one way, which uses tidy() from broom.
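Going back to the graph-building steps above, a hedged sketch assuming a stations table (id, name columns) and an edge list with origin/destination ids and a distance column (all hypothetical names):

import networkx as nx

G = nx.MultiGraph()

# Station ids are cast to strings, otherwise the integers would become node labels
# that never match the string ids used in the edge list
for _, row in stations.iterrows():
    G.add_node(str(row["id"]), name=row["name"])

# Add the edges, again casting ids to strings so they match the node names,
# and attach the line distance as the edge weight
for _, row in edge_list.iterrows():
    G.add_edge(str(row["origin"]), str(row["destination"]), weight=row["distance"])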
Scenarios
Now that we have calibrated our parameters and produced some estimates, we can start
to play around with some what-if scenarios. In a 'what if' scenario, we make the
assumption that the parameters for alpha and beta are universal (that is - they don't
change subject to circumstance), and we use this model as a basis for exploring different
scenarios by changing other data in the model, such as the observed $W_{j}$ values,
here median income at the destination (the attractive force), or the cost of travelling
between two places. As we're using straight-line distance to stand in for cost here, it doesn't make a great deal of sense to change that in our model, though! So, by way of example:
What if the government invested loads of money into a new Car Plant in Barking?
Recall that with the estimates from our unconstrained model, none of the estimates summed to the observed in- and out-flow totals. Our estimates only summed to the grand total of flows, but this is because we were really fitting a ‘total constrained’ model which used a single constant of proportionality.
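As a hedged illustration of such a scenario (the DataFrame, column names and parameter values below are all placeholders, not the calibrated values from the text, and the origin term $V_i$ with its parameter $\mu$ is an assumption about the model form, which only names $\alpha$, $\beta$ and $W_{j}$ above), the multiplicative model $T_{ij} = k V_i^{\mu} W_j^{\alpha} d_{ij}^{-\beta}$ can be re-run with an adjusted $W_{j}$:

import pandas as pd

def estimate_flows(df, k, mu, alpha, beta):
    # Multiplicative model: T_ij = k * Vi^mu * Wj^alpha * dist^(-beta)
    return k * df["Vi"] ** mu * df["Wj"] ** alpha * df["dist"] ** (-beta)

# Baseline estimates with the calibrated parameters (placeholder values)
baseline = estimate_flows(flows, k=1.0, mu=1.0, alpha=1.5, beta=2.0)

# 'What if' scenario: raise median income (the attractive force Wj) at one
# destination, e.g. Barking, and re-estimate with the same parameters
scenario = flows.copy()
scenario.loc[scenario["dest"] == "Barking", "Wj"] *= 1.5   # assumed 50% uplift
what_if = estimate_flows(scenario, k=1.0, mu=1.0, alpha=1.5, beta=2.0)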
There are, of course, plenty of things we could try out. For example:
A. we could try mapping the coefficients or the residual values from the model to see if there is any patterning in either the over- or under-prediction of flows (a sketch of the residual calculation follows this list);
B. we could explore how the model parameters vary over time;
C. we could try running our own version of a LUTI model by first calibrating the model parameters and plugging these into a multiplicative version of the model, adjusting the row (origin) totals, the column (destination) totals, or both the origin and destination totals;
D. assuming that the current level of disposable income remains the same, we could estimate, once a new destination opens, how much it might affect the business of shops in the town;
E. when approval for the new third runway at Heathrow was given, one option would be to model where the workers will be drawn from (and their likely travel-to-work patterns).
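A small sketch of the first of these, assuming a flow DataFrame holding the observed flows alongside the baseline estimates computed earlier (column names are placeholders):

# Residuals per origin-destination pair: positive means the model under-predicts
flows["resid"] = flows["obs_flow"] - baseline

# Aggregate by origin (or destination) to look for spatial patterning before mapping
resid_by_origin = flows.groupby("orig")["resid"].sum()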