Wireless technology has been regarded as a paradigm shifter in the process industry. WirelessHART, the first open wireless communication standard specifically designed for process measurement and control applications, was officially released in September 2007 as part of the HART 7 Specification. WirelessHART is a secure, TDMA-based wireless mesh networking technology operating in the 2.4 GHz ISM radio band. In this paper, we give an introduction to the architecture of WirelessHART and share our first-hand experience in building a prototype for this specification. We describe several challenges we had to tackle during the implementation, including timer design, network-wide synchronization, communication security, reliable mesh networking, and the central network manager. For each challenge, we provide a detailed analysis and propose our solution. Based on the prototype implementation, a simple WirelessHART network has been built for demonstration purposes; the demonstration network in turn validates our design. To the best of our knowledge, this is the first reported effort to build a WirelessHART protocol stack.
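The TDMA scheduling mentioned above can be illustrated with a minimal sketch. WirelessHART divides time into fixed 10 ms slots and hops across the 2.4 GHz IEEE 802.15.4 channels as a function of the absolute slot number (ASN); the helper names below are illustrative, not taken from the specification's prototype described in the paper.

```python
# Illustrative sketch (not the paper's code): WirelessHART-style TDMA slot
# timing and channel hopping. WirelessHART uses fixed 10 ms time slots and
# derives the hop from the absolute slot number (ASN) and a per-link
# channel offset, spreading traffic over the 2.4 GHz band.

SLOT_DURATION_MS = 10                   # fixed WirelessHART slot length
ACTIVE_CHANNELS = list(range(11, 26))   # IEEE 802.15.4 channels 11..25

def slot_start_ms(asn: int) -> int:
    """Start time of a slot, in ms since network birth, given its ASN."""
    return asn * SLOT_DURATION_MS

def hop_channel(asn: int, channel_offset: int) -> int:
    """Channel used in a slot: (ASN + offset) mod |active channels|."""
    return ACTIVE_CHANNELS[(asn + channel_offset) % len(ACTIVE_CHANNELS)]

# Two links with different offsets never share a channel in the same slot.
print(slot_start_ms(250))                  # 2500 ms after network birth
print(hop_channel(0, 0), hop_channel(0, 1))
```

Because every node derives the same hop sequence from the shared ASN, accurate network-wide time synchronization (one of the challenges the paper discusses) is a precondition for the hopping scheme to work at all.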
With the success of wireless technologies in consumer electronics, standard wireless technologies are envisioned for deployment in industrial environments as well. Industrial applications involving mobile subsystems, or simply the desire to save cabling, make wireless technologies attractive. Nevertheless, these applications often have stringent requirements on reliability and timing. In wired environments, timing and reliability are well catered for by fieldbus systems (a mature technology designed to enable communication between digital controllers and the sensors and actuators interfacing to a physical process). When wireless links are included, reliability and timing requirements are significantly more difficult to meet, due to the adverse properties of the radio channels. In this paper, we thus discuss some key issues arising in wireless fieldbus and wireless industrial communication systems: 1) fundamental problems like achieving timely and reliable transmission despite channel errors; 2) the usage of existing wireless technologies for this specific field of applications; and 3) the creation of hybrid systems in which wireless stations are incorporated into existing wired systems.
Network testing plays an important role in the iterative process of developing new communication protocols and algorithms. However, test environments have to keep up with the evolution of technology and require continuous update and redesign. In this article, we propose COINS, a framework that wireless technology developers can use to enable continuous integration (CI) practices in their testbed infrastructure. As a proof of concept, we provide a reference architecture and implementation of COINS for controlled testing of multi-technology 5G machine-type communication (MTC) networks. The implementation upgrades an existing wireless experimentation testbed with new software and hardware functionalities. It blends web service and operating system virtualization technologies with emerging Internet of Things technologies, enabling CI for wireless networks. Moreover, we extend an existing qualitative methodology for comparing similar frameworks, and we identify and discuss open challenges to wider use of CI practices in wireless technology development.
Bluetooth Low Energy (BLE) is an emerging low-power wireless technology developed for short-range control and monitoring applications that is expected to be incorporated into billions of devices in the next few years. This paper describes the main features of BLE, explores its potential applications, and investigates the impact of various critical parameters on its performance. BLE represents a trade-off between energy consumption, latency, piconet size, and throughput that mainly depends on parameters such as connInterval and connSlaveLatency. According to theoretical results, the lifetime of a BLE device powered by a coin cell battery ranges between 2.0 days and 14.1 years. The number of simultaneous slaves per master ranges between 2 and 5,917. The minimum latency for a master to obtain a sensor reading is 676 µs, although simulation results show that, under high bit error rate, average latency increases by up to three orders of magnitude. The paper provides experimental results that complement the theoretical and simulation findings, and indicates implementation constraints that may reduce BLE performance.
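The trade-off governed by connInterval and connSlaveLatency can be sketched as follows. A slave may skip up to connSlaveLatency connection events, so its effective wake-up period, and hence the worst-case latency for fresh data, stretches while average current drops. The per-event charge and sleep-current figures below are hypothetical placeholders, not the paper's measured values.

```python
# Illustrative sketch (numbers are hypothetical, not the paper's model):
# how connInterval and connSlaveLatency trade BLE energy for latency.

def effective_period_ms(conn_interval_ms: float, slave_latency: int) -> float:
    """Longest gap between slave wake-ups (worst-case notification latency):
    the slave may skip up to slave_latency connection events."""
    return conn_interval_ms * (slave_latency + 1)

def lifetime_days(battery_mah: float, event_charge_uc: float,
                  conn_interval_ms: float, slave_latency: int,
                  sleep_ua: float = 1.0) -> float:
    """Rough coin-cell lifetime estimate: per-event charge plus sleep
    current. event_charge_uc and sleep_ua are assumed device figures."""
    events_per_s = 1000.0 / effective_period_ms(conn_interval_ms, slave_latency)
    avg_ma = events_per_s * event_charge_uc / 1000.0 + sleep_ua / 1000.0
    return battery_mah / avg_ma / 24.0

# A 100 ms interval with slave latency 4 gives a 500 ms worst-case gap.
print(effective_period_ms(100.0, 4))
```

Stretching the effective period lengthens lifetime roughly linearly but worsens worst-case latency by the same factor, which is the trade-off the abstract quantifies at its extremes (2.0 days to 14.1 years).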
This paper develops power control algorithms for energy efficiency (EE) maximization (measured in bit/Joule) in wireless networks. Unlike previous related works, minimum-rate constraints are imposed and the signal-to-interference-plus-noise ratio takes a more general expression, which allows one to encompass some of the most promising 5G candidate technologies. Both network-centric and user-centric EE maximizations are considered. In the network-centric scenario, the maximization of the global EE and the minimum EE of the network is performed. Unlike previous contributions, we develop centralized algorithms that are guaranteed to converge, with affordable computational complexity, to a Karush-Kuhn-Tucker point of the considered non-convex optimization problems. Moreover, closed-form feasibility conditions are derived. In the user-centric scenario, game theory is used to study the equilibria of the network and to derive convergent power control algorithms, which can be implemented in a fully decentralized fashion. Both scenarios above are studied under the assumption that single or multiple resource blocks are employed for data transmission. Numerical results assess the performance of the proposed solutions, analyzing the impact of minimum-rate constraints, and comparing the network-centric and user-centric approaches.
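The network-centric objectives described above are commonly written as fractional programs. The following are the standard textbook definitions of global EE and minimum EE; the notation is generic and may differ from the paper's exact formulation.

```latex
% Standard global-EE and min-EE objectives (generic notation, not
% necessarily the paper's): K users, transmit powers p = (p_1,...,p_K),
% SINR \gamma_k(p), bandwidth B, amplifier inefficiency \mu_k, static
% circuit power P_{c,k}, and minimum-rate constraints R_k \ge R_{\min,k}.
\mathrm{GEE}(p) = \frac{\sum_{k=1}^{K} B \log_2\!\bigl(1+\gamma_k(p)\bigr)}
                       {\sum_{k=1}^{K} \bigl(\mu_k p_k + P_{c,k}\bigr)}
\quad \text{[bit/Joule]}

\max_{p \in \mathcal{P}} \; \min_{k}\;
\mathrm{EE}_k(p) = \frac{B \log_2\!\bigl(1+\gamma_k(p)\bigr)}{\mu_k p_k + P_{c,k}}
\quad \text{s.t.} \quad B \log_2\!\bigl(1+\gamma_k(p)\bigr) \ge R_{\min,k}.
```

Because the rate numerator is not concave in $p$ when $\gamma_k$ includes interference, both problems are non-convex, which is why the abstract emphasizes convergence to a Karush-Kuhn-Tucker point rather than a global optimum.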
Research on 5G mobile wireless technologies has been very active in both academia and industry in the past few years. While there has been certain consensus on the overall requirements of 5G wireless systems (e.g., in data rate, network capacity, delay), various enabling wireless technologies have been considered and studied to achieve these performance targets. It has become quite clear, however, that no single enabling technology can achieve all diverse and even conflicting 5G requirements. In general, many fundamental changes and innovations to re-engineer the overall network architecture and algorithms in different layers, and to exploit new system degrees of freedom, will be needed for the future 5G wireless system. In particular, we may need to consider other potential waveform candidates that overcome the limitations of the orthogonal frequency-division multiplexing (OFDM) waveform employed in the current 4G system, and develop disruptive technologies to fulfill 5G rate and capacity requirements, including network densification, large-scale (massive) multiple-input multiple-output (MIMO), and exploitation of the millimeter-wave (mmWave) spectrum to attain gigabit communications. In addition, design tools from the computer networking domain, including software-defined networking, virtualization, and cloud computing, are expected to play important roles in defining a more flexible, intelligent, and efficient 5G network architecture. This paper aims at describing key 5G enabling wireless mobile technologies and discussing their potentials and open research challenges. We also present how papers published in our special issue contribute to the development of these disruptive 5G technologies.
In this paper, we present an industrial development of a wireless sensor network technology called OCARI: optimization of communication for ad hoc reliable industrial networks. It targets applications in harsh environments such as power plants and warships. OCARI is a wireless-communication technology that supports mesh topology and a power-aware ad hoc routing protocol aimed at maximizing the network lifetime. It is based on the IEEE 802.15.4 physical layer with a deterministic media access control layer for time-constrained communication. During the non-time-constrained communication period, its ad hoc routing strategy uses an energy-aware proactive protocol based on optimized link-state routing. The OCARI application layer (APL) is based on the ZigBee application support sublayer and APL primitives and profiles to provide maximum compatibility with ZigBee applications. To fully assess this technology, extensive tests were done in industrial facilities at Électricité de France R&D as well as at Direction des Constructions Navales Services. Our objective is then to promote this specification as an open standard of industrial wireless technology.
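The energy-aware routing idea behind OCARI's proactive strategy can be illustrated with a small sketch. The cost metric and node names below are assumptions for illustration, not OCARI's actual metric: the intent is only to show how a link-state protocol can steer routes away from nodes with low residual energy to stretch network lifetime.

```python
# Illustrative sketch (the metric is an assumption, not OCARI's): an
# energy-aware link cost for proactive link-state routing, combined with
# shortest-path computation. Routes detour around depleted nodes.
import heapq

def link_cost(residual_energy_j: float, full_energy_j: float) -> float:
    """Cost grows as the next hop's battery depletes (simple inverse metric)."""
    return full_energy_j / max(residual_energy_j, 1e-9)

def min_cost_path(graph, src, dst):
    """Dijkstra over energy-aware costs; graph[u] = {v: cost, ...}."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, c in graph[u].items():
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

# Node B is nearly depleted (0.5 J of 10 J), so the route detours via C.
g = {
    "A": {"B": link_cost(0.5, 10.0), "C": link_cost(8.0, 10.0)},
    "B": {"D": link_cost(0.5, 10.0)},
    "C": {"D": link_cost(7.0, 10.0)},
    "D": {},
}
print(min_cost_path(g, "A", "D")[0])   # ['A', 'C', 'D']
```

In a proactive protocol each node floods its link states periodically, so the per-node energy figures feeding the cost function can be refreshed on the same schedule.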
Purpose: Hospitalization and surgery in older patients often lead to a loss of strength, mobility, and functional capacity. We tested the hypothesis that wireless accelerometry could be used to measure mobility during hospital recovery after cardiac surgery. Description: We used an off-the-shelf fitness monitor to measure daily mobility in patients after surgery. Data were transmitted wirelessly, aggregated, and configured onto a provider-viewable dashboard. Evaluation: Wireless monitoring of mobility after major surgery was easy and practical. There was a significant relationship between the number of steps taken in the early recovery period, length of stay, and dismissal disposition. Conclusions: Wireless monitoring of mobility after major surgery creates an opportunity for early identification and intervention in individual patients and could serve as a tool to evaluate and improve the process of care and to affect postdischarge outcomes.