LS06 Context & Sensors Lecture 3

About this video

This is the final lecture in this set.
Today we looked at how our mobile phones can determine their location. We looked at augmented reality (AR) and how such apps work. We looked at the use of active sensing to provide information about position, e.g. the positions of facial features, via the iPhone X's TrueDepth camera (which supports Face ID).
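As a rough illustration (a sketch, not code from the lecture), a minimal Core Location setup of the kind an app uses to receive location updates might look like this; the class and variable names are made up, and a real app also needs a usage-description string in its Info.plist:

```swift
import CoreLocation

// A minimal sketch (names are illustrative): a delegate object that
// asks for permission and then receives location updates.
final class LocationReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        // Requires an NSLocationWhenInUseUsageDescription entry in Info.plist.
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    // Called whenever Core Location has new fixes; the most recent is last.
    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        print("lat \(latest.coordinate.latitude), lon \(latest.coordinate.longitude)")
    }

    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        print("location error: \(error)")
    }
}
```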
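Likewise, the face-feature sensing mentioned above is exposed to apps through ARKit's face tracking. A sketch of what that looks like (again, names are illustrative, and it only runs on TrueDepth hardware):

```swift
import UIKit
import ARKit

// Runs a face-tracking session on TrueDepth hardware and reads
// per-feature "blend shape" values from each face anchor update.
// A real app also needs an NSCameraUsageDescription entry in Info.plist.
final class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // blendShapes maps named facial features to 0...1 activations.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            print("jawOpen: \(jawOpen)")
        }
    }
}
```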
We looked (very briefly) at the different contexts for applications.
We finished off with a look at some Swift features that we've probably seen in our programs and sample code, without realising what they were.
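This summary doesn't list the specific Swift features, but a few that commonly appear in iOS sample code without explanation are optionals, guard statements, and trailing closures; the self-contained sketch below is purely illustrative of that kind of feature:

```swift
// Optionals: a value that may be absent. `String?` is shorthand
// for Optional<String>.
var nickname: String? = "Ada"

// Optional binding: unwrap safely with `if let`.
if let nickname = nickname {
    print("Hello, \(nickname)")
}

// guard: exit early, keeping the main path unindented.
func greet(_ person: String?) {
    guard let person = person else { return }
    print("Hi, \(person)")
}

// Trailing closure syntax: the final closure argument sits
// outside the parentheses; `$0` is its first parameter.
let doubled = [1, 2, 3].map { $0 * 2 }
print(doubled) // prints [2, 4, 6]
```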