All "Focus du Mois" (Focus of the Month) articles
Runner-up (second place) for the Application Impact Award at IJCAI 2019
Political debates are the means political candidates use to put forward and justify their positions on the issues at stake in front of the electorate. Argument mining is a novel research area in Artificial Intelligence that analyzes discourse at the pragmatic level and applies argumentation theory to model and automatically analyze textual data. In this paper, the authors present DISPUTool, a tool designed to ease the work of historians and social science scholars in analyzing the argumentative content of political speeches. More precisely, DISPUTool allows users to explore and automatically identify argumentative components in the 39 political debates from the last 50 years of US presidential campaigns (1960-2016).
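To give a flavour of the kind of task DISPUTool automates, the toy sketch below tags sentences as claims or premises using a naive cue-word heuristic. The cue lists and labels are illustrative assumptions only; the actual tool relies on trained sequence-labelling models, not on keyword matching.

```python
# Toy argument-component tagger: a naive cue-word heuristic.
# Real systems such as DISPUTool use trained neural sequence labellers;
# the cue lists below are illustrative assumptions only.

PREMISE_CUES = ("because", "since", "given that", "as shown by")
CLAIM_CUES = ("should", "must", "we need", "i believe")

def tag_sentence(sentence: str) -> str:
    """Label a sentence as 'Premise', 'Claim', or 'Other'."""
    s = sentence.lower()
    if any(cue in s for cue in PREMISE_CUES):
        return "Premise"
    if any(cue in s for cue in CLAIM_CUES):
        return "Claim"
    return "Other"

debate = [
    "We must cut taxes for the middle class.",
    "Because wages have stagnated for a decade.",
    "Good evening to everyone watching.",
]
labels = [tag_sentence(s) for s in debate]
print(labels)  # ['Claim', 'Premise', 'Other']
```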
N. Isoart and J.-C. Régin: Integration of Structural Constraints into TSP Models. In: 25th International Conference on Principles and Practice of Constraint Programming (CP 2019). LNCS, vol. 11802, pp. 284-299, Connecticut, USA. Springer (2019)
Several models based on constraint programming have been proposed to solve the traveling salesman problem (TSP). The most efficient ones, such as the weighted circuit constraint (WCC), rely mainly on the Lagrangian relaxation of the TSP, based on the search for a minimum spanning tree or, more precisely, a "1-tree". The weakness of these approaches is that they do not include enough structural constraints and rely almost exclusively on edge costs. The purpose of this paper is to correct this drawback by introducing a Hamiltonian cycle constraint with associated propagators. The authors propose properties that prevent the existence of a Hamiltonian cycle in a graph or, conversely, require that certain edges belong to any TSP solution. Notably, they design a propagator based on the search for k-cutsets. Combining this constraint with the WCC constraint yields, for the resolution of the TSP, gains of an order of magnitude in the number of backtracks as well as a strong reduction in computation time.
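The 1-tree mentioned above is the basis of the classic Held-Karp lower bound: a minimum spanning tree over all cities except a distinguished one, plus the two cheapest edges incident to that city. The sketch below computes this bound on a tiny synthetic cost matrix; it illustrates the relaxation the WCC builds on, not the paper's propagators.

```python
# Sketch of the 1-tree lower bound underpinning Lagrangian-relaxation
# approaches to the TSP (as used by the weighted circuit constraint).
# Node 0 plays the role of the distinguished node.

def mst_weight(cost, nodes):
    """Prim's algorithm restricted to `nodes`; returns total MST weight."""
    nodes = list(nodes)
    in_tree = {nodes[0]}
    total = 0.0
    while len(in_tree) < len(nodes):
        u, v = min(((i, j) for i in in_tree for j in nodes if j not in in_tree),
                   key=lambda e: cost[e[0]][e[1]])
        in_tree.add(v)
        total += cost[u][v]
    return total

def one_tree_bound(cost):
    """MST on nodes 1..n-1 plus the two cheapest edges incident to node 0."""
    n = len(cost)
    w = mst_weight(cost, range(1, n))
    two_cheapest = sorted(cost[0][j] for j in range(1, n))[:2]
    return w + sum(two_cheapest)

cost_matrix = [           # symmetric 4-city example (invented costs)
    [0, 1, 2, 1],
    [1, 0, 1, 2],
    [2, 1, 0, 1],
    [1, 2, 1, 0],
]
print(one_tree_bound(cost_matrix))  # 4.0 -- here it matches the optimal tour 0-1-2-3-0
```

On this instance the bound is tight; in general the 1-tree weight only lower-bounds the optimal tour, which is why the WCC combines it with cost-based filtering.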
Alessio Pagliari, Fabrice Huet, Guillaume Urvoy-Keller. On the Cost of Acking in Data Stream Processing Systems. 2019 19th IEEE/ACM International Symposium on Cluster, Cloud, and Grid Computing (CCGRID)
The widespread use of social networks and applications such as IoT networks generates a continuous stream of data that companies and researchers want to process, ideally in real time. Data stream processing (DSP) systems enable such continuous data analysis by implementing the set of operations to be performed on the stream as a directed acyclic graph (DAG) of tasks. While these DSP systems embed mechanisms to ensure fault tolerance and message reliability, only a few studies focus on the impact of these mechanisms on the performance of applications at runtime. In this paper, the authors demonstrate the impact of the message reliability mechanism on application performance. They follow an experimental approach, using the Storm middleware, to study an acknowledgment-based framework. They compare the two standard schedulers available in Storm with applications of various degrees of parallelism, over single- and multi-cluster scenarios. The authors show that the acking layer may create an unforeseen bottleneck due to the placement of acking tasks, a problem which, to the best of their knowledge, has been overlooked in the scientific and technical literature. They propose two strategies for improving the placement of acking tasks and demonstrate their benefit in terms of throughput and latency.
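For context, Storm's acking layer tracks each tuple tree with a single 64-bit XOR checksum held by an acker task: every emitted tuple id is XORed in, and XORed in again when acked, so the checksum reaches zero exactly when the whole tree is processed. The sketch below mimics that bookkeeping in plain Python; it is a simplified illustration of the documented mechanism, not Storm code, and it uses small fixed ids where Storm uses random 64-bit ones.

```python
# Simplified illustration of Storm's XOR-based ack tracking: the acker keeps
# one checksum per spout tuple; every emitted tuple id is XORed in, and XORed
# in again when acked. The value reaches 0 exactly when the whole tuple tree
# has been acknowledged. (Illustrative sketch, not Storm code; Storm uses
# random 64-bit ids rather than the small fixed ids below.)

class Acker:
    def __init__(self):
        self.pending = {}  # spout tuple id -> running XOR checksum

    def track(self, root, tuple_id):
        """Called when a tuple anchored to `root` is emitted."""
        self.pending[root] = self.pending.get(root, 0) ^ tuple_id

    def ack(self, root, tuple_id):
        """Called when a tuple is acked; True once the tree is complete."""
        self.pending[root] ^= tuple_id
        if self.pending[root] == 0:
            del self.pending[root]
            return True
        return False

acker = Acker()
ROOT = 0x1
CHILDREN = [0xA, 0xB, 0xC]
acker.track(ROOT, ROOT)          # spout emits the root tuple
for c in CHILDREN:               # a bolt emits three anchored tuples
    acker.track(ROOT, c)
acker.ack(ROOT, ROOT)            # root acked; tree still incomplete
done = [acker.ack(ROOT, c) for c in CHILDREN]
print(done)  # [False, False, True] -- tree complete after the last ack
```

Each ack is an extra message routed to an acker task; the paper's point is that where those acker tasks are placed can turn this bookkeeping into a bottleneck.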
Respiratory Waveform Estimation From Multiple Accelerometers: An Optimal Sensor Number and Placement Analysis
A. Siqueira, A. F. Spirandeli, R. Moraes and V. Zarzoso, "Respiratory Waveform Estimation From Multiple Accelerometers: An Optimal Sensor Number and Placement Analysis," in IEEE Journal of Biomedical and Health Informatics, vol. 23, no. 4, pp. 1507-1515, July 2019.
Respiratory patterns are commonly measured to monitor and diagnose cardiovascular, metabolic, and sleep disorders. Electronic devices such as masks used to record respiratory waveforms usually require the support of medical staff and obstruct the patient's breathing, causing discomfort. New techniques are being investigated to overcome these limitations. An emerging approach uses accelerometers to estimate the respiratory waveform from chest motion. However, most existing techniques employ a single accelerometer placed at an arbitrary position on the thorax. The present work investigates the use and optimal placement of multiple accelerometers located on the thorax and the abdomen. The study population is composed of 30 healthy volunteers in three different postures. By means of a custom-made microcontroller-based system, data are acquired from an array of ten accelerometers located at predefined positions and from a pneumotachograph used as a reference. The best sensor locations are identified by optimal linear reconstruction of the reference waveform from the accelerometer data in the minimum mean square error sense. The analysis shows that right-hand-side locations contribute more often to optimal respiratory waveform estimates, a sound finding given that the right lung has a larger volume than the left. In addition, the authors show that the respiratory waveform can be blindly extracted from the recorded accelerometer data by means of independent component analysis. In conclusion, linear processing of multiple accelerometers in optimal positions can successfully recover respiratory information in clinical settings where the use of masks may be contraindicated.
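The optimal linear reconstruction step can be illustrated on a toy example: given sensor signals, the weights minimizing the mean square error against a reference waveform solve the normal equations built from the sensors' Gram matrix. The two synthetic signals below are invented for illustration (the reference is constructed as an exact linear combination, so the solver recovers the weights exactly).

```python
# Toy illustration of MMSE linear reconstruction: find weights w1, w2 so that
# w1*s1 + w2*s2 best approximates the reference waveform in the least-squares
# sense, by solving the 2x2 normal equations. All signals are synthetic.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def mmse_weights(s1, s2, ref):
    """Solve G w = b, where G is the Gram matrix of the two sensors."""
    g11, g12, g22 = dot(s1, s1), dot(s1, s2), dot(s2, s2)
    b1, b2 = dot(s1, ref), dot(s2, ref)
    det = g11 * g22 - g12 * g12
    return ((b1 * g22 - b2 * g12) / det, (g11 * b2 - g12 * b1) / det)

s1 = [1.0, 0.0, 2.0, 1.0]          # synthetic "accelerometer" signals
s2 = [0.0, 1.0, 1.0, 3.0]
ref = [2.0, 3.0, 7.0, 11.0]        # reference = 2*s1 + 3*s2 by construction
w1, w2 = mmse_weights(s1, s2, ref)
print(round(w1, 6), round(w2, 6))  # 2.0 3.0
```

With real, noisy accelerometer data the same normal equations yield the best linear estimate rather than an exact fit, and ranking sensors by their contribution to this estimate is what identifies the optimal placements.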
Alberto Dennunzio, Enrico Formenti, Luciano Margara, Valentin Montmirail, Sara Riva
Boolean automata networks, genetic regulation networks, and metabolic networks are just a few examples of biological modelling by discrete dynamical systems (DDS). A major issue in modelling is the verification of the model against experimental data, or the induction of the model under uncertainties in the data. Equipping finite discrete dynamical systems with the algebraic structure of a commutative semiring provides a suitable context for hypothesis verification on the dynamics of DDS. Indeed, hypotheses about the systems can be translated into polynomial equations over DDS, and solutions to these equations validate the initial hypotheses. Unfortunately, solving general equations over DDS is undecidable. In this article, the authors push the envelope further by proposing a practical approach for some decidable cases, in a setting that they call Hypothesis Checking. They demonstrate that many decidable equations boil down to a "simpler" equation. However, the problem is not to decide whether the simple equation has a solution, but to enumerate all its solutions in order to verify the hypothesis on the real, undecidable systems. The authors evaluate their approach experimentally and show that it has good scalability properties.
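The semiring structure mentioned above can be made concrete on tiny examples. The sketch below represents a finite DDS as a dict mapping each state to its successor, and implements the sum as disjoint union and the product as direct product of the dynamics; this choice of operations is an assumption matching the usual algebraic setting such work builds on, not a reconstruction of the paper's exact definitions.

```python
# Sketch of semiring operations on finite discrete dynamical systems (DDS),
# each represented as a dict state -> next state. Assumed operations: sum is
# the disjoint union of the systems, product their direct product.

def dds_sum(f, g):
    """Disjoint union: states are tagged by origin, dynamics unchanged."""
    h = {(0, x): (0, y) for x, y in f.items()}
    h.update({(1, x): (1, y) for x, y in g.items()})
    return h

def dds_product(f, g):
    """Direct product: pairs of states evolve component-wise."""
    return {(x, u): (f[x], g[u]) for x in f for u in g}

f = {0: 1, 1: 0}            # a 2-cycle
g = {'a': 'a', 'b': 'a'}    # a fixed point with one incoming state
s = dds_sum(f, g)
p = dds_product(f, g)
print(len(s), len(p))       # 4 4
print(p[(0, 'b')])          # (1, 'a') -- both components advance one step
```

Polynomial equations over DDS then equate such sums and products of known and unknown systems, which is what makes hypothesis verification an equation-solving problem.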
Lucile Sassatelli, Marco Winckler, Thomas Fisichella, Ramon Aparicio-Pardo, Anne-Marie Dery-Pinna
29th Workshop on Network and Operating System Support for Digital Audio and Video (NOSSDAV'19)
Despite exciting prospects, the development of 360° videos is persistently hindered by the difficulty of streaming them. To reduce the data rate, existing streaming strategies adapt the video rate to the user's Field of View (FoV), but the difficulty of predicting the FoV and the persistent lack of bandwidth are important obstacles to achieving the best experience. In this article, the authors exploit recent findings on human attention in VR to introduce a new degree of freedom for the streaming algorithm to leverage: Virtual Walls (VWs) are designed to translate the bandwidth limitation into a new type of impairment, preserving the visual quality by subtly limiting the user's freedom during well-chosen periods. The authors carry out experiments with 18 users and confirm that, if the VW is positioned after the exploration phase in scenes with concentrated saliency, a substantial fraction of users seldom perceive it. With a double-stimulus approach, the authors show that, compared with a reference with no VW consuming the same amount of data, VWs can improve the quality of experience. Simulations of different FoV-based streaming adaptations with and without VWs show that VWs reduce stalls and increase the quality in the FoV.
Sami Lazreg, Maxime Cordy, Philippe Collet, Patrick Heymans, Sébastien Mosser. Multifaceted Automated Analyses for Variability-Intensive Embedded Systems. International Conference on Software Engineering (ICSE) (ACM/IEEE). Montréal, QC, Canada, 25-31 May 2019.
Embedded systems, such as those found in the automotive domain, must comply with stringent functional and non-functional requirements. To fulfil these requirements, engineers are confronted with a plethora of design alternatives at both the software and hardware level, out of which they must select the optimal solution with respect to possibly antagonistic quality attributes (e.g. cost of manufacturing vs. speed of execution). The authors propose a formal model-driven framework to assist engineers in this choice. It captures high-level specifications of the system alternatives in the form of dataflows with variability and configurable hardware platforms. A mapping algorithm then derives the design space, i.e. the set of compatible pairs of application and platform variants, and a variability-aware executable model encoding the functional and non-functional behaviour of all viable system variants. Novel verification algorithms then pinpoint the optimal system variants efficiently. The benefits of the approach are evaluated through an industrial case study.
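The design-space derivation step can be sketched in miniature: pair every application variant with every platform variant, keep the compatible pairs, then retain only the Pareto-optimal ones with respect to antagonistic attributes such as cost and execution time. The variant names, compatibility rule, and numbers below are invented for illustration; the paper's framework works on dataflow and platform models, not on tuples like these.

```python
from itertools import product

# Toy sketch of design-space derivation: enumerate application x platform
# pairs, filter by compatibility, keep the Pareto front on (cost, time).
# All variants and numbers are hypothetical.

apps = [("fir_parallel", 4), ("fir_serial", 1)]         # (name, cores needed)
platforms = [("ecu_small", 2, 10), ("ecu_big", 8, 25)]  # (name, cores, cost)

def metrics(app, plat):
    """Hypothetical quality attributes: (manufacturing cost, exec time)."""
    (_, need), (_, cores, cost) = app, plat
    return (cost, 100 * need / cores)

compatible = [(a, p) for a, p in product(apps, platforms) if a[1] <= p[1]]

def pareto(cands):
    """Keep candidates not dominated on both cost and time."""
    ms = {c: metrics(*c) for c in cands}
    return [c for c in cands
            if not any(all(o <= m for o, m in zip(ms[d], ms[c])) and ms[d] != ms[c]
                       for d in cands)]

best = pareto(compatible)
print([(a[0], p[0]) for a, p in best])
# [('fir_serial', 'ecu_small'), ('fir_serial', 'ecu_big')]
```

The cheap-but-slow and expensive-but-fast variants both survive, while the dominated pairing is discarded, mirroring the trade-off analysis the verification algorithms automate at scale.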
Maria João Rendas, Asya Metelkina and Luc Pronzato
The use of doping substances in sport is a serious problem which today extends well beyond high-level competition, and whose effective detection must rely on automated screening tools allowing fast, low-cost identification of possibly doped athletes.
Since April 2017, the world governing body IAAF (International Association of Athletics Federations, Monaco) and I3S have been collaborating on the mathematical modelling of the performances of elite athletes. This is the first time the IAAF has worked with a French research institute on the processing of its data in this context. The goal of this collaboration (I3S, SIS team: Dr Metelkina, Prof. Rendas, Prof. Pronzato; IAAF: Dr Bermon and Dr Garnier) is the probabilistic characterization of the evolution of athletes' performance throughout their careers.
The ABP (Athlete Biological Passport), based on haematological indicators obtained from samples collected under strict protocols, already enables reliable triage of the monitored athlete population. However, the costs associated with the ABP limit both the number of athletes monitored and the frequency of the tests each athlete undergoes. By using the probabilistic model to flag departures of an athlete's performances from evolutions considered normal, this collaboration should lead, on the one hand, to an early doping-detection system able to continuously monitor a large number of athletes and, on the other, to an optimization of anti-doping test plans.
A comparison of two complementary modelling approaches (parametric versus non-parametric) for representing the performances of a set of male athletes in track events (400, 800, 1500, 5000 and 10000 metres) is currently in press.
The figures show two careers flagged as abnormal by the non-parametric method. The thick blue line is the athlete's trajectory; the other curves are a set of "typical" trajectories. The figure below corresponds to a 400-metre record holder, with times well below those of the athlete population analysed, while the first figure (above) corresponds to an athlete whose 400-metre performances improve (atypically and suspiciously) at the end of his career.
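The idea of flagging a career as abnormal can be sketched with a crude heuristic: compare an athlete's trajectory of season-best times with the pointwise median of a set of typical trajectories, and flag it when the mean deviation exceeds a threshold. All numbers and the threshold below are invented for illustration; the published method is a full probabilistic (non-parametric) model, not this heuristic.

```python
from statistics import median

# Toy anomaly flag for a career trajectory: mean absolute deviation from the
# pointwise median of "typical" trajectories. Synthetic 400 m season-best
# times (seconds) over five career years; threshold chosen arbitrarily.

typical = [
    [47.2, 46.8, 46.5, 46.6, 46.9],
    [47.5, 47.0, 46.7, 46.8, 47.1],
    [46.9, 46.6, 46.4, 46.5, 46.8],
]

def is_abnormal(traj, population, threshold=0.5):
    """Flag a trajectory far (on average) from the pointwise median."""
    med = [median(col) for col in zip(*population)]
    deviation = sum(abs(t - m) for t, m in zip(traj, med)) / len(traj)
    return deviation > threshold

late_improver = [47.3, 47.1, 46.9, 45.2, 44.1]   # suspicious late-career drop
steady = [47.1, 46.9, 46.6, 46.7, 46.9]
print(is_abnormal(late_improver, typical), is_abnormal(steady, typical))
# True False
```

The late-career improvement pattern is exactly the kind of atypical evolution the second figure illustrates.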
Joëlle Despeyroux, Amy Felty, Pietro Lio and Carlos Olarte.
Data streams for a personalised breast cancer programme could include collections of image data, tumour genome sequencing (likely at the single-cell level), and liquid biopsies (DNA and Circulating Tumour Cells (CTCs)). Although these datasets are rich in information, their full power will not be realised until methods are developed to model the cancer systems and conduct analyses that transect these streams. In addition to machine learning approaches, the authors believe that logical reasoning has the potential to help in clinical decision support systems for doctors. They develop a logical approach to modelling cancer progression, focusing on mutation analysis and CTCs, which includes the appearance of driver mutations, the transformation of normal cells into cancer cells in the breast, their circulation in the blood, and their path to the bone.
Their long-term goal is to improve the prediction of survival of metastatic breast cancer patients. The authors model the behaviour of the CTCs as a transition system and use Linear Logic (LL) to reason about this model. They consider several important properties of CTCs and prove them in LL. In addition, they formalise their results in the Coq proof assistant, thus providing formal proofs for their model. The authors believe that these results provide a promising proof of principle and can be generalised to other cancer types and groups of driver mutations.
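A transition system like the one the authors reason about can be sketched very simply: states describe a cell's status and a search procedure checks reachability properties such as "can a normal breast cell end up in the bone?". The state names and transitions below are simplified assumptions for illustration; the paper establishes such properties as Linear Logic proofs checked in Coq, not by graph search.

```python
from collections import deque

# Toy transition system for CTC progression; reachability checked by BFS.
# States and transitions are simplified assumptions for illustration.

transitions = {
    "normal_breast_cell": {"mutated_breast_cell"},      # driver mutation
    "mutated_breast_cell": {"cancer_cell_breast"},
    "cancer_cell_breast": {"ctc_in_blood"},             # intravasation
    "ctc_in_blood": {"cancer_cell_bone", "apoptosis"},  # extravasation or death
    "cancer_cell_bone": set(),
    "apoptosis": set(),
}

def reachable(start, goal):
    """Breadth-first search over the transition system."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            return True
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("normal_breast_cell", "cancer_cell_bone"))  # True
print(reachable("apoptosis", "ctc_in_blood"))               # False
```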
A little bit of green in networks and other problems of placement and management of resources.
In this thesis (HDR, the French habilitation degree), Frédéric Giroire presents a set of solutions to optimize network infrastructures. Pushed by the new sensitivity of society, policy makers, and companies to energy costs and global warming, he investigated the question of how to build green networks. He first studied some practical scenarios to answer the question: how much energy could Internet service providers save by putting energy-efficient protocols into practice? This led him to study fundamental problems of graph theory.
At the core of these energy-efficient methods lies a dynamic adaptation to changes in demand, which is impossible in legacy networks, as they are mostly operated manually. The emergence of two new paradigms, software-defined networking (SDN) and network function virtualization (NFV), allows a finer control of networks and thus bears the promise of putting energy-efficient solutions into practice. Frédéric thus studied how to use SDN to implement dynamic routing.
His approach has been to use theoretical tools to solve problems raised by the introduction of new technologies or new applications. His tools come mainly from combinatorics, in particular graph theory, algorithmics, optimization, and probability theory. When he was able to propose new resolution methods, he then tried to evaluate their practical impact by numerical evaluation, simulation, or experimentation with realistic scenarios.
More about this thesis: see the manuscript and the slides of Frédéric's defense.
Frank De Boer, Vlad Serbanescu, Reiner Hähnle, Ludovic Henrio, Justine Rochas, et al.. A Survey of Active Object Languages. ACM Computing Surveys, Association for Computing Machinery, 2017, 50 (5), pp.1 - 39.
To program parallel systems efficiently and easily, a wide range of programming models have been proposed, each with different choices concerning synchronization and communication between parallel entities. Among them, the actor model is based on loosely coupled parallel entities that communicate by means of asynchronous messages and mailboxes. Some actor languages provide a strong integration with object-oriented concepts; these are often called active object languages. This paper reviews four major actor and active object languages and compares them according to carefully chosen dimensions that cover central aspects of the programming paradigms and their implementation.
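The active-object idea the survey covers can be sketched in a few lines: callers post method requests into a mailbox and immediately receive a future, while a single internal thread processes requests one at a time, so the object's state needs no locks. This is a simplified illustration of the pattern in plain Python, not the API of any of the surveyed languages.

```python
import queue
import threading
from concurrent.futures import Future

# Minimal active-object sketch: asynchronous method calls return futures;
# a single internal thread serialises all message processing, so the
# object's state is accessed without locks. (Illustrative pattern only.)

class ActiveCounter:
    def __init__(self):
        self._mailbox = queue.Queue()
        self._count = 0
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            fut, n = self._mailbox.get()   # messages handled one at a time
            self._count += n               # state touched only by this thread
            fut.set_result(self._count)

    def add(self, n):
        """Asynchronous method call: returns a future, never blocks."""
        fut = Future()
        self._mailbox.put((fut, n))
        return fut

counter = ActiveCounter()
futures = [counter.add(1) for _ in range(5)]
print(futures[-1].result())  # 5 -- FIFO mailbox, messages processed in order
```

The choice between blocking on a future immediately or later is precisely one of the synchronization dimensions along which the surveyed languages differ.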