diff --git a/competition/tex/chapters/components.tex b/competition/tex/chapters/components.tex
index a24a0af..cf4fef8 100644
--- a/competition/tex/chapters/components.tex
+++ b/competition/tex/chapters/components.tex
@@ -1,30 +1,7 @@
 \section{Component Description}
 Our indoor localisation solely uses the sensors provided by almost each commodity smartphone.
-	The readings of all those sensors are fused using recursive density estimation, directly on the phone:
-
-	\commentByFrank{state beschreiben: x, y, z, heading. oder machst du das schon weiter oben? dann kann vermutlicha uch die formel hier weg}
-
-	\begin{equation}
-		\arraycolsep=1.2pt
-		\begin{array}{ll}
-			&p(\mStateVec_{t} \mid \mObsVec_{1:t}) \propto\\
-			&\underbrace{p(\mObsVec_{t} \mid \mStateVec_{t})}_{\text{evaluation}}
-			\int \underbrace{p(\mStateVec_{t} \mid \mStateVec_{t-1}, \mObsVec_{t-1})}_{\text{transition}}
-			\underbrace{p(\mStateVec_{t-1} \mid \mObsVec_{1:t-1})d\vec{q}_{t-1}}_{\text{recursion}} \enspace,
-		\end{array}
-		\label{eq:recursiveDensity}
-	\end{equation}
-
-	\docWIFI{} and (if available) \docIBeacon{}s serve as absolute positioning component. If the smartphone provides
-	a barometer, its measurements are used as an additional, relative verification for the current $z$-component
-	of the pedestrian's location.
-
-	The transition in \refeq{eq:recursiveDensity} is carried out using random walks on a graph, which is built offline, and uses
-	the building's floorplan. During the localisation process, the smartphone's IMU (accelerometer, gyroscope) is used to constrain the random walk
-	in both, distance and heading.
-
-	The recursive density estimation is implemented using a particle-filter.
+	\input{chapters/barometer.tex}
 \input{chapters/wifi.tex}
diff --git a/competition/tex/chapters/introduction.tex b/competition/tex/chapters/introduction.tex
index d1557f3..acb98dd 100644
--- a/competition/tex/chapters/introduction.tex
+++ b/competition/tex/chapters/introduction.tex
@@ -2,34 +2,59 @@
 The navigation system is based on our previous works, primarily on the approach presented in \cite{ebner-15}.
 For this, we have been awarded the best overall paper award at IPIN 2015 in Banff, Canada.
-Since then, we extended our approach by prior navigation knowledge using realistic human walking paths \cite{} and smoothing methods \cite{}.
+Since then, we have extended our approach with prior navigation knowledge based on realistic human walking paths \cite{ebner-16} and with smoothing methods \cite{fetzer-16}.
 Additionally, a self-developed map editor allows for creating advanced 3D maps and realistically shaped stairs.
 Compared to many other systems, we avoid any time-consuming fingerprinting and calibration processes and are able to start with a uniform distribution over the whole building.
-All calculations are computed in real time on a commercial smartphone, in most of our examples this is the Motorola Nexus 6.
+All calculations are computed in real time on a commercial smartphone; in most of our examples this is the Motorola Nexus 6 or the Samsung Galaxy S5.
 The system is implemented in C++ using the Qt framework and OpenCL.
 An overview of all involved components and the sensor fusion procedure can be seen in fig. \ref{}.
 Here, the smartphone provides all necessary measurements and no additional device is needed.
+The readings of all those sensors are fused using recursive density estimation, directly on the phone:
+%
+\begin{equation}
+	\arraycolsep=1.2pt
+	\begin{array}{ll}
+		&p(\mStateVec_{t} \mid \mObsVec_{1:t}) \propto\\
+		&\underbrace{p(\mObsVec_{t} \mid \mStateVec_{t})}_{\text{evaluation}}
+		\int \underbrace{p(\mStateVec_{t} \mid \mStateVec_{t-1}, \mObsVec_{t-1})}_{\text{transition}}
+		\underbrace{p(\mStateVec_{t-1} \mid \mObsVec_{1:t-1})d\vec{q}_{t-1}}_{\text{recursion}} \enspace,
+	\end{array}
+	\label{eq:recursiveDensity}
+\end{equation}
+%
+where $\mObsVec_{1:t} = \mObsVec_{1}, \mObsVec_{2}, \ldots, \mObsVec_{t}$ is the series of observations up to time $t$.
+The hidden state $\mStateVec$ is given by
+\begin{equation}
+	\mStateVec = (x, y, z, \mStateHeading, \mStatePressure),\enskip
+	x, y, z, \mStateHeading, \mStatePressure \in \R \enspace,
+\end{equation}
+%
+where $x, y, z$ represent the position in 3D space, $\mStateHeading$ the user's heading, and $\mStatePressure$ the relative atmospheric pressure prediction in hectopascal (hPa).
+The recursive part of the density estimation contains all information up to time $t-1$.
+Furthermore, the state transition $p(\mStateVec_{t} \mid \mStateVec_{t-1}, \mObsVec_{t-1})$ models the pedestrian's movement and is carried out using random walks on a graph that is built offline from the building's floorplan \cite{ebner-16}.
+
+The observation vector contains all sensor measurements relevant for evaluating the current state and is defined as
+%
+\begin{equation}
+	\mObsVec = (\mRssiVec_\text{wifi}, \mRssiVec_\text{ib}, \mObsHeading, \mObsSteps, \mObsPressure) \enspace,
+\end{equation}
+%
+where $\mRssiVec_\text{wifi}$ and $\mRssiVec_\text{ib}$ contain the measurements of all nearby \docAP{}s (\docAPshort{})
+and \docIBeacon{}s, respectively. Both serve as absolute positioning components. $\mObsHeading$ and $\mObsSteps$ describe the relative angular change and the number of steps detected for the pedestrian.
+If the smartphone provides a barometer, $\mObsPressure$ is used as an additional, relative verification of the current $z$-component of the pedestrian's location.
+%
+The recursive density estimation of eq. \eqref{eq:recursiveDensity} is implemented using a particle filter with the state transition as proposal density.
+This ensures valid position estimates even if a sensor is defective or is not provided by the smartphone at all.
-This ensures sensor is defect or is not provided by the smartphone itself
-\begin{itemize}
-	\item Hinfuehren zum System
-	\item aus welchen arbeiten fuegt sich das system zusammen?
-	\item grober ueberblick ueber die einzelnen komponenten und sensoren
-	\item modulare uebersicht ueber das gesamte system. (denis bild + smoothing und prior)
-	\item particle filter mit formeluebersicht und was fusioniert wird
+\section{Prior Arrangements}
+System setup is very easy and no fingerprinting is required.
 \begin{figure}[h!]
 	\centering%
 	\includegraphics[trim=99 0 0 0, clip, width=8.2cm]{editor1.png}%
 \end{figure}
-\end{itemize}
-
-\cite{ebner-15}
-
-\section{Prior Arrangements}
-System setup is very easily and no fingerprinting is required.
 \begin{itemize}
 	\item Map building: Grobe Beschreibung, Funktionen und Moeglichkeiten des Map Builders.
 	bildchen
diff --git a/competition/tex/egbib.bib b/competition/tex/egbib.bib
index 33e042d..3e223ad 100644
--- a/competition/tex/egbib.bib
+++ b/competition/tex/egbib.bib
@@ -1725,7 +1725,7 @@ doi={10.1109/PLANS.2008.4570051},}
 pages={1-10},
 }
 
-@inproceedings{Ebner-16,
+@inproceedings{ebner-16,
 author={Ebner, Frank and Fetzer, Toni and Grzegorzek, Marcin and Deinzer, Frank},
 booktitle={19th Int. Conf. on Information Fusion (FUSION)},
 title={{On Prior Navigation Knowledge in Multi Sensor
@@ -2734,4 +2734,14 @@
 volume = {3},
 year = {2009}
 }
 
+@inproceedings{fetzer-16,
+	author={Fetzer, Toni and Ebner, Frank and K{\"o}ping, Lukas and Grzegorzek, Marcin and Deinzer, Frank},
+	booktitle={Indoor Positioning and Indoor Navigation (IPIN), Int. Conf. on},
+	title={{On Monte Carlo Smoothing in Multi Sensor Indoor Localisation}},
+	year={2016},
+	IGNOREmonth={October},
+	pages={1-8},
+}
+
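
A remark on the particle filter introduced in the introduction.tex hunk above: when the state transition $p(\mStateVec_{t} \mid \mStateVec_{t-1}, \mObsVec_{t-1})$ serves as the proposal density, standard sequential importance resampling reduces the per-particle update to drawing from the transition and weighting by the evaluation term of eq. \eqref{eq:recursiveDensity}. A minimal LaTeX sketch of that update follows; the particle superscript $(i)$ and the weight symbol $w$ are notation assumed here, not macros from the paper.

% Sketch only: bootstrap/SIR update implied by using the transition as proposal.
% Particle index (i) and weight symbol w are assumed notation.
\begin{equation}
	\mStateVec_{t}^{(i)} \sim p(\mStateVec_{t} \mid \mStateVec_{t-1}^{(i)}, \mObsVec_{t-1}) \enspace, \qquad
	w_{t}^{(i)} \propto w_{t-1}^{(i)} \, \underbrace{p(\mObsVec_{t} \mid \mStateVec_{t}^{(i)})}_{\text{evaluation}} \enspace,
\end{equation}

after which resampling resets the weights to $1/N$. The proposal cancels against the transition in the importance ratio, which is why only the evaluation term remains.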
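
To make the predict, weight, and resample cycle concrete, here is a compact C++ sketch (C++ because the introduction states the system is implemented in C++ with Qt and OpenCL). Everything in it is a placeholder rather than the authors' code: the free-space random walk stands in for their graph-based transition on the building's floorplan, the evaluation only sketches a Gaussian barometer term and omits the Wi-Fi and iBeacon RSSI models, and all names and constants (State, Observation, filterStep, the 0.7 m step length, the noise parameters, the particle count) are assumptions.

// Illustrative sketch of one filter step: SIR particle filter with the
// pedestrian-movement transition as proposal density. Placeholder models
// and constants throughout; not the project's actual implementation.
#include <cmath>
#include <cstddef>
#include <iostream>
#include <random>
#include <vector>

struct State { double x, y, z, heading, pressure; };            // (x, y, z, phi, p)
struct Observation { double headingChange, steps, pressure; };  // IMU + barometer part only
struct Particle { State q; double w; };

std::mt19937 rng{42};

// Transition p(q_t | q_{t-1}, o_{t-1}): free-space random walk constrained by
// the detected steps and heading change. The real system instead performs the
// random walk on a graph built offline from the building's floorplan.
State transition(const State& prev, const Observation& lastObs) {
    std::normal_distribution<double> headNoise(0.0, 0.1), distNoise(0.0, 0.2);
    State next = prev;
    next.heading += lastObs.headingChange + headNoise(rng);
    const double dist = lastObs.steps * 0.7 + distNoise(rng);   // ~0.7 m per step (assumed)
    next.x += std::cos(next.heading) * dist;
    next.y += std::sin(next.heading) * dist;
    return next;
}

// Evaluation p(o_t | q_t): here only a Gaussian barometer term; the real
// evaluation additionally scores the Wi-Fi and iBeacon RSSI vectors.
double evaluate(const State& q, const Observation& obs) {
    const double sigma = 0.05;                                  // hPa (assumed)
    const double d = obs.pressure - q.pressure;
    return std::exp(-0.5 * d * d / (sigma * sigma));
}

// One recursion step: predict with the transition, weight with the evaluation,
// then systematic resampling (weights are reset to 1/N afterwards).
void filterStep(std::vector<Particle>& ps, const Observation& obs,
                const Observation& lastObs) {
    double sum = 0.0;
    for (auto& p : ps) {
        p.q = transition(p.q, lastObs);
        p.w *= evaluate(p.q, obs);
        sum += p.w;
    }
    const std::size_t n = ps.size();
    std::uniform_real_distribution<double> u(0.0, 1.0 / n);
    std::vector<Particle> out;
    out.reserve(n);
    double cum = 0.0, u0 = u(rng);
    std::size_t i = 0;
    for (std::size_t k = 0; k < n; ++k) {
        const double target = u0 + static_cast<double>(k) / n;
        while (cum + ps[i].w / sum < target && i + 1 < n) cum += ps[i++].w / sum;
        out.push_back({ps[i].q, 1.0 / n});
    }
    ps = std::move(out);
}

int main() {
    // The paper starts with a uniform distribution over the whole building;
    // here all particles simply start at the origin for brevity.
    Particle p0{{0, 0, 0, 0, 0}, 1.0 / 1000};
    std::vector<Particle> ps(1000, p0);
    filterStep(ps, {0.05, 2, 0.01}, {0.0, 0, 0.0});
    std::cout << "first particle x after one step: " << ps[0].q.x << "\n";
}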