homogeneous sources of sensory data to achieve more accurate and synthetic readings. When portable devices are employed, data compression is an important factor, since collecting raw information from multiple sources generates huge information spaces that can pose a problem in terms of memory or communication bandwidth for portable systems. Data-level information fusion tends to generate large input spaces that slow down the decision-making procedure. Data-level fusion also often cannot handle incomplete measurements: if one sensor modality becomes useless due to malfunction, breakdown, or other reasons, the whole system may produce ambiguous outcomes.
each node delivers independent measures of the same properties. This configuration can be used for error correction by comparing information from multiple nodes. Redundant strategies are often used with high-level fusion in voting procedures. A complementary configuration occurs when multiple information sources supply different information about the same features. This strategy is used to fuse information at the raw-data level within decision-making algorithms. Complementary features are typically applied in motion recognition tasks with
preliminary data- or feature-level processing. The main goal of decision fusion is to use a meta-level classifier, while the data from each node are preprocessed by extracting features from them. Typically, decision-level sensor fusion is used in classification and recognition activities, and the two most common approaches are majority voting and naive Bayes. Advantages of decision-level fusion include reduced communication bandwidth and improved decision accuracy. It also allows the combination of heterogeneous sensors.
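As an illustration, majority voting over node decisions can be sketched in a few lines (the activity labels below are illustrative, not from any particular dataset):

```python
from collections import Counter

def majority_vote(decisions):
    """Decision-level fusion: return the label most nodes agree on.

    `decisions` holds one class label per sensor node; ties are broken
    by first occurrence, a simple and common policy.
    """
    return Counter(decisions).most_common(1)[0][0]

# Three nodes classify the same activity independently; the fused
# decision is the label reported by the majority of nodes.
fused = majority_vote(["walking", "running", "walking"])
```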
{\displaystyle {\textbf {L}}_{k}={\begin{bmatrix}{\tfrac {\sigma _{2}^{2}{\textbf {P}}_{k}}{\sigma _{2}^{2}{\textbf {P}}_{k}+\sigma _{1}^{2}{\textbf {P}}_{k}+\sigma _{1}^{2}\sigma _{2}^{2}}}&{\tfrac {\sigma _{1}^{2}{\textbf {P}}_{k}}{\sigma _{2}^{2}{\textbf {P}}_{k}+\sigma _{1}^{2}{\textbf {P}}_{k}+\sigma _{1}^{2}\sigma _{2}^{2}}}\end{bmatrix}}.}
, clustering methods, and other techniques. Cooperative sensor fusion uses the information extracted by multiple independent sensors to provide information that would not be available from any single sensor. For example, sensors attached to adjacent body segments can be used to detect the angle between them. A cooperative sensor strategy yields information that is impossible to obtain from single nodes. Cooperative information fusion can be used in motion recognition,
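The body-segment example admits a small sketch: each sensor contributes only the direction of its own segment, and the joint angle exists only in their combination (the vectors below are illustrative):

```python
import math

def segment_angle_deg(u, v):
    """Cooperative fusion sketch: angle between two body segments.

    Each direction vector comes from a sensor on one segment; neither
    sensor alone can produce the inter-segment angle.
    """
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (norm_u * norm_v)))

# Illustrative direction vectors for two adjacent segments
angle = segment_angle_deg((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```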
load. It is important to select carefully the features on which classification procedures are defined: choosing the most efficient feature set should be a main aspect of method design. Using feature selection algorithms that properly detect correlated features and feature subsets improves recognition accuracy, but large training sets are usually required to find the most significant feature subset.
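A minimal sketch of this feature-level pipeline (the per-node features chosen here, mean and population variance, are illustrative):

```python
def node_features(samples):
    """Each sensing node computes a small feature vector on board,
    here the mean and (population) variance of its raw samples."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return [mean, var]

def fuse_features(per_node_samples):
    """The fusion node concatenates the per-node feature vectors into
    one small input space for the classifier, instead of receiving
    all of the raw samples."""
    fused = []
    for samples in per_node_samples:
        fused.extend(node_features(samples))
    return fused

# Two nodes, four raw samples each -> one four-dimensional feature vector
vec = fuse_features([[1.0, 1.0, 1.0, 1.0], [0.0, 2.0, 0.0, 2.0]])
```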
In sensor fusion, centralized versus decentralized refers to where the fusion of the data occurs. In centralized fusion, the clients simply forward all of the data to a central location, and some entity at the central location is responsible for correlating and fusing the data. In decentralized fusion, the
approach to determine the traffic state (low traffic, traffic jam, medium flow) using roadside-collected acoustic, image, and sensor data. In the field of autonomous driving, sensor fusion is used to combine the redundant information from complementary sensors in order to obtain a more accurate and
Another classification of sensor configuration refers to the coordination of information flow between sensors. These mechanisms provide a way to resolve conflicts or disagreements and to allow the development of dynamic sensing strategies. Sensors are in a redundant (or competitive) configuration if
Feature level - features represent information computed on board by each sensing node. These features are then sent to a fusion node to feed the fusion algorithm. This procedure generates smaller information spaces with respect to data-level fusion, which is better in terms of computational
Data level - data-level (or early) fusion aims to fuse raw data from multiple sources and represents the fusion technique at the lowest level of abstraction. It is the most common sensor fusion technique in many fields of application. Data-level fusion algorithms usually aim to combine multiple
Decision level - decision-level (or late) fusion is the procedure of selecting a hypothesis from a set of hypotheses generated by the individual (usually weaker) decisions of multiple nodes. It is the highest level of abstraction and uses information that has already been elaborated through
The sensor fusion level can also be defined based on the kind of information used to feed the fusion algorithm. More precisely, sensor fusion can be performed by fusing raw data coming from different sources, extrapolated features, or even decisions made by single nodes.
has less uncertainty than would be possible if these sources were used individually. For instance, one could potentially obtain a more accurate location estimate of an indoor object by combining multiple data sources such as video cameras and
By inspection, when the first measurement is noise free, the filter ignores the second measurement and vice versa. That is, the combined estimate is weighted by the quality of the measurements.
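This behaviour can be checked numerically against the gain formula above; a small sketch (the variance values are illustrative):

```python
def fusion_gain(P_k, var1, var2):
    """Gain weights for fusing two measurements:
    L_k = [var2*P_k, var1*P_k] / (var2*P_k + var1*P_k + var1*var2),
    with P_k the solution of the filter's Riccati equation and
    var1, var2 the measurement noise variances.
    """
    denom = var2 * P_k + var1 * P_k + var1 * var2
    return var2 * P_k / denom, var1 * P_k / denom

# With a noise-free first measurement (var1 = 0), the whole weight
# falls on it and the second measurement is ignored.
g1, g2 = fusion_gain(P_k=2.0, var1=0.0, var2=3.0)
```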
clients take full responsibility for fusing the data. "In this case, every sensor or platform can be viewed as an intelligent asset having some degree of autonomy in decision-making."
is the variance of the combined estimate. It can be seen that the fused result is simply a linear combination of the two measurements weighted by their respective inverse noise variances.
vision (calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints).
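For an idealized rectified camera pair, this depth calculation reduces to depth = focal length × baseline / disparity; a small sketch with illustrative numbers:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Idealized stereo depth: Z = f * B / d for a rectified pair.

    focal_px: focal length in pixels; baseline_m: distance between the
    two cameras in metres; disparity_px: horizontal pixel shift of the
    same scene point between the two images.
    """
    return focal_px * baseline_m / disparity_px

# A point shifted by 20 px between cameras 0.1 m apart (f = 700 px)
# lies 3.5 m away; larger disparities correspond to closer points.
z = depth_from_disparity(700.0, 0.1, 20.0)
```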
. This is useful, for example, in determining the attitude of an aircraft using low-cost sensors. Another example is using the
in this case can mean more accurate, more complete, or more dependable, or refer to the result of an emerging view, such as
{\displaystyle {\textbf {x}}_{3}=\sigma _{3}^{2}(\sigma _{1}^{-2}{\textbf {x}}_{1}+\sigma _{2}^{-2}{\textbf {x}}_{2})}
The data sources for a fusion process are not specified to originate from identical sensors. One can distinguish
{\displaystyle \sigma _{3}^{2}=(\sigma _{1}^{-2}+\sigma _{2}^{-2})^{-1}}
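The two formulas above translate directly into code; a minimal sketch for scalar measurements (the values are illustrative):

```python
def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two noisy measurements.

    Implements x3 = var3 * (x1/var1 + x2/var2) with
    var3 = 1 / (1/var1 + 1/var2); the fused variance is never larger
    than either input variance.
    """
    var3 = 1.0 / (1.0 / var1 + 1.0 / var2)
    x3 = var3 * (x1 / var1 + x2 / var2)
    return x3, var3

# Two thermometers reading the same temperature: the fused estimate
# lies closer to the more precise (lower-variance) sensor.
x3, var3 = fuse(20.0, 1.0, 22.0, 4.0)
```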
and fusion of the outputs of the former two. Direct fusion is the fusion of sensor data from a set of
with hundreds of bands) and fuse relevant information to produce classification results.
, and history values of sensor data, while indirect fusion uses information sources like
Sensor fusion is a term that covers a number of methods and algorithms, including:
, which is also employed within the Fraser-Potter fixed-interval smoother, namely
based methods can simultaneously process many channels of sensor data (such as
There are several categories or levels of sensor fusion that are commonly used.
within the gain calculation it can be found that the filter gain is given by:
Another (equivalent) method to fuse two measurements is to use the optimal
Multiple combinations of centralized and decentralized systems exist.
. Suppose that the data is generated by a first-order system and let
(TML) is an XML-based markup language which enables sensor fusion.
Although technically not a dedicated sensor fusion method, modern
data or data derived from disparate sources so that the resulting
Two example sensor fusion calculations are illustrated below.
, respectively. One way of obtaining a combined measurement
Level 1 – Entity assessment (e.g. signal/feature/object).
data is fused using various methods, e.g. the
Tracking and object detection/recognition/identification
Level 4 – Process refinement (i.e. sensor management)
knowledge about the environment and human input.
for combining independent tests of significance
Combining of sensor data from disparate sources
reliable representation of the environment.
{\displaystyle \sigma _{2}^{2}}
{\displaystyle \sigma _{1}^{2}}
International Society of Information Fusion
denote two sensor measurements with noise
Discriminant Correlation Analysis (DCA)
165:, the largest sensor ever to be built
2021:IEEE Robotics and Automation Letters
1483:
1410:
One application of sensor fusion is
denote the solution of the filter's
{\displaystyle {\textbf {P}}_{k}}
{\displaystyle {\textbf {x}}_{3}}
{\displaystyle {\textbf {x}}_{2}}
{\displaystyle {\textbf {x}}_{1}}
Centralized versus decentralized
Sensor fusion is also known as
Level 2 – Situation assessment
Convolutional neural network
Convolutional neural network
is the process of combining
Level 3 – Impact assessment
Electronic Support Measures
Transducer Markup Language
Brooks–Iyengar algorithm
inertial navigation system
inverse-variance weighting
Global Positioning System
Level 5 – User refinement
Global Positioning System
Level 0 – Data alignment
, such as the proposed
Multimodal integration
extended Kalman filter
Support-vector machine
Square Kilometre Array
Hyperspectral imaging
uncertainty reduction
Example calculations
Hidden Markov model
Examples of sensors
and is a subset of
Gaussian processes
and other acoustic
information fusion
signals. The term
Bayesian networks
WiFi localization
Data (computing)
Riccati equation
Magnetic sensors
Fisher's method
motion analysis
1026:
997:
996:
983:
978:
967:
962:
948:
942:
941:
940:
934:
929:
915:
909:
908:
907:
901:
896:
884:
878:
877:
876:
870:
865:
857:
855:
842:
837:
826:
821:
807:
801:
800:
799:
793:
788:
774:
768:
767:
766:
760:
755:
743:
737:
736:
735:
729:
724:
716:
709:
708:
699:
693:
692:
691:
689:
686:
685:
657:
651:
650:
649:
647:
644:
643:
613:
609:
600:
595:
578:
573:
556:
551:
544:
541:
540:
516:
510:
509:
508:
499:
494:
481:
475:
474:
473:
464:
459:
446:
441:
428:
422:
421:
420:
418:
415:
414:
390:
384:
383:
382:
380:
377:
376:
358:
353:
346:
343:
342:
324:
319:
312:
309:
308:
288:
282:
281:
280:
278:
275:
274:
257:
251:
250:
249:
247:
244:
243:
237:
219:Dempster–Shafer
List of sensors
Seismic sensors
Radiotelescopes
thermal imaging
(multi-sensor)
indirect fusion
External links
Neural network
. By applying
Scanning LIDAR
Accelerometers
gait analysis
Cramer's rule
Kalman filter
636:
617:
614:
604:
601:
596:
592:
587:
582:
579:
574:
570:
562:
557:
552:
548:
517:
503:
500:
495:
491:
487:
482:
468:
465:
460:
456:
447:
442:
438:
434:
429:
413:
412:
411:
409:
391:
359:
354:
350:
325:
320:
316:
307:
289:
258:
240:
230:
227:
225:
222:
220:
217:
215:
212:
210:
209:Kalman filter
207:
206:
205:
195:
191:
189:
186:
184:
181:
178:
175:
173:
170:
167:
164:
160:
157:
155:
152:
150:
147:
145:
142:
140:
137:
134:
130:
127:
124:
122:
118:
115:
112:
110:
107:
106:
100:
98:
97:
92:
91:
84:
82:
81:
76:
72:
68:
67:heterogeneous
direct fusion
Sensor fusion
Image fusion
Applications
Phased array
soft sensors
stereoscopic
Sensor grid
Data mining
data fusion
Infrared /
data fusion
homogeneous
information
Algorithms
TV cameras
variances
Sonobuoys
sensors,
See also
, where
a priori
GPS/INS
Levels
where
camera
Flash
sensor
Sonar
Radar
(GPS)
LIDAR
(ESM)
and
and
and
Let
MEMS
or