Journal: TIK-Schriftenreihe
Publisher: ETH Zurich, Computer Engineering and Networks Laboratory
Search Results
Publications 1 - 10 of 43
- Security econometrics
  Doctoral Thesis, TIK-Schriftenreihe. Frei, Stefan (2009)

- Dynamic Protocol Stacks
  Doctoral Thesis, TIK-Schriftenreihe. Trammell-Keller, Ariane (2014)

- On Efficient Data Exchange in Multicore Architectures
  Doctoral Thesis, TIK-Schriftenreihe. Tretter, Andreas (2018)
  In contemporary multicore architectures, three trends can be observed: (i) a growing number of cores, (ii) shared memory as the primary means of communication and data exchange, and (iii) high diversity between platform architectures. Still, these platforms are typically programmed manually on a core-by-core basis; the most widely accepted aid is the library implementation of frequently used algorithms. This already complicated task of multicore programming will grow further in complexity as core counts increase. In addition, the constant change in architecture designs, and thus in platform-specific programming demands, will continue to make it laborious to migrate existing code to new platforms.
  State-of-the-art methods of automatic multicore code generation only partially meet the requirements of modern multicore platforms. They typically incur a high overhead per thread, even though growing numbers of cores, and thus shrinking thread granularities, demand the opposite. They also typically use message-passing models to implement data exchange where memory sharing would be the natural mode. As a result, they often fail to produce efficient code, especially when large data throughput is required.
  This thesis proposes a data-oriented approach to multicore programming. It shows how dividing a program into discrete tasks with clearly specified inputs and outputs helps to formalise the problem of optimising high-data-throughput applications for a large range of multicore architectures, while at the same time enabling an efficient, low-overhead implementation. In detail, its contributions are as follows.
  * Inefficiencies in existing programming models are demonstrated for the cases of the CAL actor language and Kahn process networks, and methods are shown to reduce these inefficiencies.
  * Ladybirds, a specification model and language for parallel programs, is presented. A Ladybirds program consists of tasks with clearly defined inputs and outputs, and of dependencies between them. It is explained how Ladybirds aims at execution efficiency also in the domains of data placement and transport, and what steps are necessary to get from a Ladybirds specification to executable program code. The examples of comfortable debugging and of minimising state-retention overhead for transient systems underline the usability and versatility of Ladybirds.
  * An optimisation method for Ladybirds programs on the Kalray MPPA platform is presented; it places data on different memory banks so as to avoid access conflicts. Afterwards, the Ladybirds optimisation problem for the general case of arbitrary target platforms is formalised, different aspects of it are discussed in greater detail, and requirements for particular target platforms are examined.
  * Finally, a better understanding of contemporary hardware is sought. For that purpose, different probabilistic descriptions and models for interleaved on-chip memory are proposed and evaluated.

- Scalable flow control for interconnection networks
  Doctoral Thesis,
  TIK-Schriftenreihe. Gramsamer, Ferdinand (2003)

- Toward Structured and Time-Constraint Content Delivery Systems
  Doctoral Thesis, TIK-Schriftenreihe. Meier, Remo (2011)

- The architecture of an interactive multimedia communication system
  Doctoral Thesis, TIK-Schriftenreihe. Röthlisberger, Urs (1998)

- Leveraging Synchronous Transmissions for the Design of Real-time Wireless Cyber-Physical Systems
  Doctoral Thesis,
  TIK-Schriftenreihe. Jacob, Romain (2020)
  Cyber-Physical Systems (CPS) refer to systems where some intelligence is embedded into devices that interact with their environment; that is, collecting information from the physical space, processing that information, and taking actions that affect the environment. Automatically turning the heating on when the room temperature gets cold is one of the simplest examples of a CPS. Things get more complex when applications are distributed between low-power devices that should operate autonomously for multiple years. Then, performing reliable and energy-efficient wireless communication becomes paramount. Moreover, applications often specify deadlines; that is, maximal tolerable delays between the execution of distributed tasks. Systems that guarantee to meet such deadlines are called real-time systems. Wireless CPS capable of providing real-time guarantees while using low-power communication technology are desirable, but they are particularly challenging to design.
  In the past few years, a technique known as synchronous transmissions (ST) has been shown to enable reliable and energy-efficient communication in low-power multi-hop networks. In a nutshell, ST consists in letting multiple devices transmit a packet during the same time interval; communication is likely to be successful if the transmissions are well synchronized, hence the name. ST can be leveraged to realize any multi-hop broadcast – a one-to-all communication – in a given time; a very interesting property for designing real-time systems. While the potential of ST is recognized by the low-power wireless academic community, this technique has not yet been leveraged for the design of CPS. We identify at least three issues that limit the adoption of ST in this domain: (i) ST is difficult to use due to stringent time synchronization requirements, on the order of microseconds; there is a lack of tools to facilitate the implementation of ST by CPS engineers, who are often not wireless communication experts. (ii) There are only a few examples showcasing the use of ST for CPS applications, and academic works based on ST tend to focus on communication rather than applications; convincing proof-of-concept CPS applications are missing. (iii) The inherent variability of the wireless environment makes performance evaluation challenging; the lack of an agreed-upon methodology hinders experiment reproducibility and limits the confidence in performance claims.
  Consequently, we developed support tools and methods to facilitate the evaluation of wireless protocols and the implementation of CPS based on ST. Furthermore, we leveraged ST to design two CPS solutions targeting different classes of real-time applications. This dissertation presents these contributions.
  In Chapter 2, we propose to design and analyze performance-evaluation experiments for networking protocols using a concrete, rational, and statistically sound methodology. We implement this methodology in a framework called TriScale, which makes it possible to state performance claims with quantifiable levels of confidence. Furthermore, we leverage the TriScale framework to propose the first formalized definition of reproducibility for networking experiments.
  Chapter 3 presents Baloo, a flexible design framework for network stacks based on ST. Users implement their protocol through the programming interface offered by Baloo while the framework handles the complex low-level operations, e.g., meeting the time synchronization requirements of ST. We show that Baloo is flexible enough to implement a wide variety of communication protocols while introducing only limited memory and energy overhead.
  Finally, we design and implement two wireless CPS based on ST: the Distributed Real-time Protocol (DRP) uses contracts to maximize the flexibility of execution between distributed tasks (Chapter 4); Time-Triggered Wireless (TTW) statically co-schedules all task executions and packet transfers to minimize end-to-end latency (Chapter 5). We demonstrate that real-time guarantees can be provided in a reliable and energy-efficient manner. Furthermore, TTW supports update rates of tens of milliseconds, which is sufficient to perform distributed closed-loop control of inverted pendulums – a fundamental benchmark for control and robotic applications.
  With this dissertation, we showcase that ST is suitable to meet the requirements of real-time wireless CPS. Furthermore, we facilitate the implementation of such systems with Baloo, a design framework that makes ST accessible to the non-expert. Finally, TriScale provides an important building block to confidently evaluate the performance of networking protocols – an essential component of wireless CPS. Building on TriScale, it would be useful to define benchmark problems representative of different classes of applications to serve as a baseline for the evaluation of future wireless CPS solutions. Ultimately, we must transition from proof-of-concepts to real-world wireless CPS applications; this would be further facilitated by porting Baloo to newer and more powerful platforms, thereby pushing the limits of achievable performance.

- Harnessing Environmental Data at the Edge of the Cloud
  Doctoral Thesis,
  TIK-Schriftenreihe. Meyer, Matthias (2021)
  Global warming is a defining challenge of our time, with devastating consequences for local habitats. High mountain areas are particularly affected by global warming, leading to a decline of their cryosphere (glaciers, snow cover and permafrost). In high-alpine steep bedrock, permafrost thaw decreases the stability of mountain slopes, leading to an increase in rockfalls and landslides and thereby putting life and built infrastructure at risk. Monitoring these environmental changes is important for natural-hazard warning and for understanding the geophysical processes leading to such hazards. Moreover, by providing evidence from large-scale, long-term measurements, environmental monitoring helps to bolster scientific findings and can call attention to the immediate impacts of climate change.
  The rise of wireless sensor networks offers a range of possibilities for environmental monitoring, enabling large-scale deployments with high spatio-temporal resolution using many different sensor types. The cheap and diverse sensors can be installed at hard-to-reach places with little available networking or power infrastructure. However, the resulting datasets (often heterogeneous, long-term measurements) require complex data analysis. Moreover, networking or power failures often lead to error-prone data collection and to fragmented, noisy datasets. Analyzing these datasets typically requires dedicated domain-expert knowledge, which cannot be scaled to long-term monitoring datasets. Machine learning provides options to extract information automatically, but these techniques usually require a clean dataset for training, and their performance is strongly affected by differences in the distribution of training and test data. In this dissertation, we consequently develop tools and methods applicable to heterogeneous, long-term, noisy datasets originating in wireless sensor network deployments.
  The main contributions of the dissertation are:
  - A methodology to work with fragmented and noisy data from a real-world sensor network deployment at the Matterhorn, Switzerland. The methodology uses active learning with a human in the loop and a heterogeneous set of sensors to systematically filter out unwanted influences from seismic signals.
  - The development and installation of an array of low-power, event-triggered micro-seismic sensors for the purpose of rockfall early warning. In addition, a machine-learning-based human footstep classifier is designed and optimized for computation on memory-constrained embedded devices to detect humans in the hazard zone.
  - Unsupervised and semi-supervised methods designed to bridge machine-learning technology and domain-expert knowledge by providing experts with automated information extraction, and machine-learning algorithms with crucial information such as the system context.
  - foReal, a data analytics and visualization platform that allows combining data from different sources. It is designed for long-term and large-scale environmental datasets and focuses on robustness against data corruption, missing data and misconfigurations during data processing, as well as misinterpretations during experiment design and analysis. The tooling developed enables fast and easy exchange between experts of various domains and offers the public access to scientific data.

- Towards Guarantees for Adaptive Embedded Systems in Uncertain Environments
  Doctoral Thesis,
  TIK-Schriftenreihe. Draskovic, Stefan (2021)

- Mastering imperfect and partial information in wireless sensing systems
  Doctoral Thesis, TIK-Schriftenreihe. Keller, Matthias (2013)
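The Tretter (2018) abstract above describes dividing a program into tasks with clearly specified inputs and outputs, from which dependencies follow. A minimal sketch of that idea, with all names hypothetical (Ladybirds itself is a separate specification language; this Python toy only shows how producer/consumer buffers induce a schedulable task graph):

```python
from graphlib import TopologicalSorter

# Hypothetical tasks declaring named input/output buffers, in the spirit
# of the task model from the abstract (not actual Ladybirds syntax).
tasks = {
    "read":   {"inputs": [],        "outputs": ["raw"]},
    "filter": {"inputs": ["raw"],   "outputs": ["clean"]},
    "stats":  {"inputs": ["clean"], "outputs": ["report"]},
}

# Dependencies are derived, not declared: a task depends on whichever
# task produces each of its input buffers.
producer = {buf: name for name, t in tasks.items() for buf in t["outputs"]}
graph = {name: {producer[b] for b in t["inputs"]} for name, t in tasks.items()}

order = list(TopologicalSorter(graph).static_order())
print(order)  # ['read', 'filter', 'stats']
```

Any schedule respecting this partial order is valid, which is what leaves a compiler free to map tasks to cores and place buffers in memory banks.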
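The Jacob (2020) abstract above mentions TriScale, a framework for performance claims with quantifiable confidence. Shown purely as an illustration, and not necessarily the thesis's exact method, one standard distribution-free ingredient for such claims is the binomial confidence level for a median bracketed by order statistics of repeated runs:

```python
from math import comb

def median_ci_level(n: int, lo: int, hi: int) -> float:
    """Probability that the true median lies between the lo-th and hi-th
    order statistics of n i.i.d. runs (distribution-free; each sample
    falls below the median with probability 1/2)."""
    return sum(comb(n, k) for k in range(lo, hi)) / 2 ** n

# With 20 runs, the interval spanned by the 6th and 15th smallest
# measurements covers the true median with ~95.9% confidence.
print(round(median_ci_level(20, 6, 15), 3))  # 0.959
```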
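The Meyer (2021) abstract above uses active learning with a human in the loop to filter seismic signals. A minimal uncertainty-sampling sketch, with all names and numbers hypothetical rather than taken from the thesis:

```python
def pick_for_labelling(confidences, budget=3):
    """Uncertainty sampling: events whose model confidence is closest to
    0.5 are the most ambiguous, so they go to the human expert first."""
    ranked = sorted(range(len(confidences)),
                    key=lambda i: abs(confidences[i] - 0.5))
    return ranked[:budget]

# Made-up model confidences for six candidate seismic events.
scores = [0.95, 0.52, 0.10, 0.48, 0.75, 0.30]
queries = pick_for_labelling(scores)
# The expert labels these events; the labels extend the training set and
# the detector is retrained, iterating until performance is acceptable.
print(queries)  # [1, 3, 5]
```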