Overview of Android phone orientation, including the compass

I've been trying to get my head around the Android orientation sensors. I thought I understood them. Then I realised I didn't. Now I think (hope) I have a better feel for them again, but I'm still not 100% sure. I'll try to explain my patchy understanding of them, and hopefully people can correct me where I'm wrong or fill in any gaps.

I imagine I'm standing at 0 degrees longitude (the prime meridian) and 0 degrees latitude (the equator). This location is actually in the sea off the coast of Africa, but bear with me. I hold my phone up in front of my face so that the bottom of the phone points at my feet; I'm facing north (looking toward Greenwich), so the right-hand side of the phone points east toward Africa. In this orientation (refer to the diagram below), the X axis points east, the Z axis points south, and the Y axis points to the sky.

Now, the sensors in the phone allow you to work out the orientation (not the location) of the device in this situation. This is the part that has always confused me, probably because I wanted to understand how it worked before I accepted that it simply does. It seems the phone determines its orientation using a combination of two different techniques.

Before I get to that, imagine being back on that imaginary piece of land at 0 degrees latitude and longitude, facing in the direction mentioned above. Imagine also that you are blindfolded and your shoes are fixed to a playground roundabout. If someone shoves you in the back, you will fall forward (toward the north) and put both hands out to break your fall. Similarly, if someone shoves your left shoulder, you will fall over onto your right hand. Your inner ear has "gravitational sensors" (YouTube clip) which allow you to detect whether you are falling forward/back, falling left/right, or falling down (or up!!). Therefore humans can detect alignment and rotation around the same X and Z axes as the phone.

Now imagine someone rotates you 90 degrees on the roundabout, so that you are now facing east. You are being rotated around the Y axis. This axis is different, because we can't detect it biologically. We know we have been turned by a certain angle, but we have no idea how our orientation relates to the planet's magnetic north pole. Instead, we need to use an external tool... a magnetic compass. This lets us determine which direction we are facing. The same is true of our phones.

Now the phone also has a three-axis accelerometer. I don't really know how these actually work, but the way I picture it is to imagine gravity as constant, uniform "rain" falling from the sky, and the axes in the figure above as tubes that can measure the amount of rain flowing through them. When the phone is held upright, all the rain flows through the Y tube. If the phone is gradually rotated so that its screen faces the sky, the amount of rain flowing through Y drops to zero while the amount flowing through Z steadily increases until it reaches a maximum. Similarly, if we now tip the phone onto its side, the X tube ends up collecting the maximum amount of rain. So, by measuring how much rain flows through the three tubes, you can work out the orientation of the phone.
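
To make that concrete, here's a rough sketch (just my illustration; the class name is made up) of a listener that logs how the "rain" is split across the three tubes:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.Log;

// Rough sketch only (hypothetical class name): logs how gravity
// (about 9.81 m/s^2) is split across the phone's X, Y and Z axes,
// like rain flowing through the three tubes.
public class RainGaugeListener implements SensorEventListener {

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            float x = event.values[0]; // rain through the X tube
            float y = event.values[1]; // rain through the Y tube
            float z = event.values[2]; // rain through the Z tube
            // Held upright: y is ~9.81 and x, z are ~0.
            // Lying flat, screen up: z is ~9.81 and x, y are ~0.
            // Tipped onto its side: x is ~9.81 and y, z are ~0.
            Log.d("RainGauge", "x=" + x + " y=" + y + " z=" + z);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}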

The phone also has an electronic compass, which behaves like an ordinary compass: its "virtual needle" points to magnetic north. Android merges the information from these two sensors so that whenever a SensorEvent of type TYPE_ORIENTATION is generated, its three-element values[] array holds:
values[0]: Azimuth (the compass bearing east of magnetic north)
values[1]: Pitch, rotation around the X axis (the phone tilting forward or backward)
values[2]: Roll, rotation around the Y axis (the phone tilting onto its left or right side)
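
For reference, reading those three values straight off the event looks roughly like this (a sketch only; as noted below this sensor type is deprecated):

// Sketch: reading the deprecated orientation sensor's three values inside
// a SensorEventListener registered for Sensor.TYPE_ORIENTATION.
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ORIENTATION) {
        float azimuthDeg = event.values[0]; // degrees east of magnetic north
        float pitchDeg   = event.values[1]; // rotation around the X axis
        float rollDeg    = event.values[2]; // rotation around the Y axis
    }
}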

So I think (though I don't know for sure) that the reason Android gives you the azimuth (compass bearing) rather than the reading of a third accelerometer is that the compass bearing is simply more useful. I'm not sure why they deprecated this sensor type, because now it seems you need to register a listener with the system for SensorEvents of type TYPE_MAGNETIC_FIELD. The event's values[] array then needs to be passed into the SensorManager.getRotationMatrix(..) method to obtain a rotation matrix (see below), which in turn is passed into the SensorManager.getOrientation(..) method. Does anyone know why the Android team deprecated Sensor.TYPE_ORIENTATION? Was it for efficiency? That's what is implied in one of the comments to a similar question, but you still need to register a different type of listener in the development/samples/Compass/src/com/example/android/compass/CompassActivity.java example.

Now I'd like to talk about the rotation matrix. (This is where I'm least sure of myself.) So above we have the three figures from the Android documentation, which we'll call A, B and C.

A = the figure for the SensorManager.getRotationMatrix(..) method, representing the world's coordinate system

B = the coordinate system used by the SensorEvent API.

C = the figure for the SensorManager.getOrientation(..) method

So my understanding is that A represents the "world's coordinate system", which I presume refers to the way locations on the planet are given as a (latitude, longitude) pair plus an optional altitude. X is the "easting" coordinate and Y is the "northing" coordinate. Z points to the sky and represents altitude.

The phone's coordinate system, shown in figure B, is fixed: its Y axis always points out of the top of the phone. The rotation matrix is continuously computed by the phone and provides the mapping between the two. So am I right in thinking that the rotation matrix transforms the coordinate system of B into that of C? So when you call the SensorManager.getOrientation(..) method, you get back a values[] array whose values correspond to figure C. When the phone is pointing at the sky, the rotation matrix is the identity matrix (the matrix equivalent of 1), meaning no mapping is needed because the phone is aligned with the world's coordinate system.
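
One sanity check I can think of (my own sketch, nothing official) is to test whether the rotation matrix returned by SensorManager.getRotationMatrix(..) is roughly the identity, which should be the case when the phone lies flat on its back with its top edge pointing to magnetic north, i.e. aligned with the world's coordinate system. This assumes a 9-element rotation matrix (the method also accepts a 16-element one):

// Sketch: true when the 3x3 rotation matrix R is approximately the
// identity, i.e. the phone's axes line up with the world's axes.
static boolean isRoughlyIdentity(float[] R, float tolerance) {
    for (int row = 0; row < 3; row++) {
        for (int col = 0; col < 3; col++) {
            float expected = (row == col) ? 1f : 0f;
            if (Math.abs(R[row * 3 + col] - expected) > tolerance) {
                return false;
            }
        }
    }
    return true;
}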

OK. I think I'd better stop there. As I said before, I hope people will tell me where I've messed this up, or that it has helped someone else (or confused them even more!).

Have a look at this: Stackoverflow.com: Q.5202147

You seem to be mostly right up until the three diagrams A, B and C. After that you have got yourself confused.

You might want to check out the One Screen Turn Deserves Another article. It explains why you need the rotation matrix.

In a nutshell, the phone's sensors always use the same coordinate system, even when the device is rotated.

In applications that are not locked to a single orientation, the screen coordinate system changes when you rotate the device. Thus, when the device is rotated from its default view mode, the sensor coordinate system is no longer the same as the screen coordinate system. The rotation matrix in this case is used to transform A to C (B always remains fixed).

Here's a code snippet to show you how it can be used.

SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);

// Register this class as a listener for the accelerometer sensor...
sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_NORMAL);
// ...and for the magnetic field sensor
sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_NORMAL);

// ...
// The following code goes inside a class implementing SensorEventListener
// ...

float[] inR = new float[16];
float[] I = new float[16];
float[] gravity = new float[3];
float[] geomag = new float[3];
float[] orientVals = new float[3];

double azimuth = 0;
double pitch = 0;
double roll = 0;

public void onSensorChanged(SensorEvent sensorEvent) {
    // If the sensor data is unreliable, return
    if (sensorEvent.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE)
        return;

    // Store the values of whichever sensor has changed
    switch (sensorEvent.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            gravity = sensorEvent.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomag = sensorEvent.values.clone();
            break;
    }

    // Once gravity and geomag have values, find the rotation matrix
    if (gravity != null && geomag != null) {

        // Check that the rotation matrix was found
        boolean success = SensorManager.getRotationMatrix(inR, I,
                gravity, geomag);
        if (success) {
            SensorManager.getOrientation(inR, orientVals);
            azimuth = Math.toDegrees(orientVals[0]);
            pitch = Math.toDegrees(orientVals[1]);
            roll = Math.toDegrees(orientVals[2]);
        }
    }
}
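
One extra note on the screen-rotation point above: when the activity is not locked to its default orientation, you can remap the rotation matrix into the screen's coordinate system before calling getOrientation. A minimal sketch, continuing from the snippet above; the axis pair below is a common choice for a display rotated 90 degrees, and the right pair depends on the device's current rotation (see Display.getRotation()):

// Sketch: remap the rotation matrix into the screen's coordinate system
// before computing the orientation angles. AXIS_Y / AXIS_MINUS_X suits a
// display rotated 90 degrees; adjust the axes for other rotations.
float[] outR = new float[16];
if (SensorManager.remapCoordinateSystem(inR,
        SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR)) {
    SensorManager.getOrientation(outR, orientVals);
}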

Roll is a function of gravity: a 90 degree roll puts all of gravity into the X register.

Pitch is the same: a 90 degree pitch up puts all of gravity into the Y register.

Yaw / heading / azimuth has no effect on gravity; it is ALWAYS at right angles to gravity, so no matter which way you are facing, the gravity reading stays the same and tells you nothing about your heading.

This is why you need a compass to get the heading. Maybe that makes sense?
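
To put some rough numbers behind that, here is a sketch of estimating tilt from the gravity components alone (my own helper with its own sign conventions, which won't match SensorManager.getOrientation exactly); note that heading cannot come out of it:

// Sketch: tilt estimated from the accelerometer's gravity components.
// ax, ay, az are the at-rest accelerometer readings in m/s^2.
static double[] tiltFromGravity(float ax, float ay, float az) {
    // How far the phone's Y axis is tipped up from the horizontal:
    // 0 when flat on its back, ~90 when held upright.
    double pitchDeg = Math.toDegrees(Math.atan2(ay, Math.sqrt(ax * ax + az * az)));
    // How far the phone's X axis is tipped up from the horizontal:
    // 0 when flat, ~90 when lying on its side.
    double rollDeg = Math.toDegrees(Math.atan2(ax, Math.sqrt(ay * ay + az * az)));
    // Heading/azimuth cannot be recovered here; gravity looks identical
    // whichever compass direction the phone faces.
    return new double[] { pitchDeg, rollDeg };
}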

I was having this issue, so I mapped out what happens in different directions. If the device is mounted in landscape fashion, e.g. in a car mount, the 'degrees' from the compass seem to run from 0-275 (going clockwise); above 269 (between west and north) it counts backwards from -90 to 0, then forwards from 0 to 269 again, so 270 becomes -90.

Still in landscape, but with the device lying on its back, my sensor gives 0-360; in portrait mode it runs 0-360 both lying on its back and standing up in portrait.
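
If the negative readings are a nuisance, one small sketch of a fix (assuming the azimuth has already been converted to degrees) is to fold it into the 0-360 compass range:

// Sketch: fold an azimuth reported in the -180..180 range into 0..360,
// so that -90 is reported as 270.
static double toCompassDegrees(double azimuthDeg) {
    return (azimuthDeg + 360.0) % 360.0;
}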

Hope that helps someone