# Notes of the month, September edition

This was an incredibly exciting month - it kicked off with a retreat to the Isle of Wight and only got better. Currently I'm exploring different use cases for our predictive platform and getting used to the whole not-traveling-every-week life. Next thing you know, I might even sign up at a local gym.
Speaking of not traveling (too much): greetings from the machineByte conference in Paris!

Upcoming events

1.10., 6:30pm - Behavioural finance lessons from sports coaching @ CFAUK
https://secure.cfauk.org/events/events-details/index.html?article=MzE0Ng==
8.10., 6:30pm - LQG: Has Value met its Waterloo?
https://mailchi.mp/5485a64c31bb/lqg-2019-10-08-1830-at-goldman-sachs-inigo-fraser-jenkins-has-value-met-its-waterloo?e=e73aa0923d
17.10., 6pm - Validation Methods for Machine Learning @ GARP
https://www.garp.org/#!/membership/chapters/meetings/a1U1W000002H01cUAC
24.10., around 7:30pm (tbc) - Guided tour in the Science Museum, mail me if you'd like to join
30.10., 6:30pm - How to Speak Machine @ LSE
http://www.lse.ac.uk/Events/2019/10/20191030t1830vCBA/how-to-speak-machine

Free Satellite Data + getting roads from satellite images

The imaging and spectral data of Europe's Sentinel satellites (10 m resolution) is made freely available on the Open Access Hub. It's a brilliant source for your ML image-processing experiments. One interesting example is the OSM+Facebook project, which identified roads in satellite images to improve the quality of OSM maps in less digitally connected areas.
The data: https://scihub.copernicus.eu/dhus/#/home
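To give a flavour of what you can do with the raw band data once downloaded: a classic first step is computing NDVI from the red (B04) and near-infrared (B08) 10 m bands, and treating low-NDVI pixels as candidate paved surfaces. This is a minimal numpy sketch on a toy array, not the segmentation model the OSM+Facebook project actually used; the band names and threshold are my own illustrative choices.

```python
import numpy as np

def ndvi(red, nir, eps=1e-6):
    """Normalised Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + eps)

def road_candidates(red, nir, threshold=0.2):
    """Crude mask: low-NDVI (non-vegetated) pixels are candidate paved surfaces."""
    return ndvi(red, nir) < threshold

# Toy 2x2 "scene": one vegetated pixel (top-left), three bare/paved ones.
red = np.array([[0.10, 0.30], [0.28, 0.32]])  # stand-in for Sentinel-2 B04
nir = np.array([[0.50, 0.33], [0.30, 0.35]])  # stand-in for Sentinel-2 B08
mask = road_candidates(red, nir)
# The vegetated pixel has NDVI ~0.67 and is excluded; the rest are candidates.
```

A real pipeline would of course feed the masked imagery into a proper segmentation network rather than stop at a threshold, but the band arithmetic looks just like this.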

Jeff Dean - on building intelligent systems

Here is an interesting talk by Jeff Dean (of Google fame) on using machine learning to build intelligent systems. As a bonus, there is also a collection of Jeff Dean facts (in the style of Chuck Norris facts), with gems such as:
- Jeff Dean puts his pants on one leg at a time, but if he had more than two legs, you would see that his approach is actually O(log n)
- Unsatisfied with constant time, Jeff Dean created the world's first O(1/N) algorithm.
Jeff Dean facts: http://www.informatika.bg/jeffdean
The video: https://youtu.be/4skC-kGCYRA

Reversible Neural Networks

There is a variation of deep residual networks in which the computations of each layer can be reversed: the inputs of a block can be reconstructed exactly from its outputs. This means you don't have to store activations during backpropagation, which substantially reduces the memory footprint without sacrificing performance.
The paper: https://papers.nips.cc/paper/6816-the-reversible-residual-network-backpropagation-without-storing-activations.pdf
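The trick is to split the activations into two halves and couple them so the block is exactly invertible. Here is a minimal numpy sketch of that coupling (the residual functions F and G are arbitrary toy maps of my own choosing; in the paper they are full residual sub-networks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy residual functions F and G: fixed random linear maps with a tanh
# nonlinearity. Any functions work -- invertibility comes from the coupling,
# not from F or G themselves.
W_f = rng.standard_normal((4, 4))
W_g = rng.standard_normal((4, 4))
F = lambda x: np.tanh(x @ W_f)
G = lambda x: np.tanh(x @ W_g)

def rev_forward(x1, x2):
    """Forward pass of one reversible block: y1 = x1 + F(x2), y2 = x2 + G(y1)."""
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def rev_inverse(y1, y2):
    """Recover the inputs exactly from the outputs -- no stored activations."""
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

# Round-trip on random inputs: reconstruction is exact (up to float error).
x1, x2 = rng.standard_normal((2, 4)), rng.standard_normal((2, 4))
y1, y2 = rev_forward(x1, x2)
r1, r2 = rev_inverse(y1, y2)
```

During backpropagation, each block can recompute its inputs on the fly from its outputs this way, so only the final activations need to be kept in memory.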

Reading Next: What would you propose?

I'd like to expand my reading list a little - do you have any suggestions? If no better ideas come along, I'll probably go with Pearl's Causality or Tetlock's Superforecasting.
https://www.amazon.de/Superforecasting-Science-Prediction-Philip-Tetlock/dp/1847947158
https://www.amazon.de/Causality-Judea-Pearl/dp/052189560X