NEC orchestrating a brighter world
NEC Laboratories Europe

Information and Communication Technologies for a better tomorrow

At NEC Laboratories Europe GmbH, we invent and collaborate to deliver solutions to our society’s greatest challenges. We push the boundaries of AI, IoT, blockchain and 5G technologies through original contributions published at top conferences and in leading journals, thereby delivering new value to NEC’s global business.



Blog: Recent Highlights from our Research


Inferring Dependency Structures for Relational Learning

Graph neural networks (GNNs) are a popular class of machine learning models whose major advantage is their ability to incorporate a sparse and discrete dependency structure between data points. Unfortunately, GNNs can only be used when such a graph structure is available. In practice, however, real-world graphs are often noisy and incomplete, or might not be available at all. In this work, we propose to jointly learn the graph structure and the parameters of graph convolutional networks (GCNs) by approximately solving a bilevel program that learns a discrete probability distribution over the edges of the graph. This allows GCNs to be applied not only in scenarios where the given graph is incomplete or corrupted, but also in those where no graph is available at all. We conduct a series of experiments that analyze the behavior of the proposed method and demonstrate that it outperforms related methods by a significant margin.
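
The sketch below illustrates the core idea in a deliberately simplified form: a probability is learned for every candidate edge, a graph is sampled from these probabilities with a differentiable relaxation, and the GCN weights are trained on the sampled graph. It is a minimal single-level sketch, not the bilevel procedure of the paper; the tensor sizes, the random data, the relaxed Bernoulli sampling and the training loop are illustrative assumptions.

# Minimal sketch (not the authors' exact bilevel algorithm): jointly learn
# Bernoulli edge probabilities and GCN weights on toy random data.
import torch
import torch.nn.functional as F

num_nodes, num_feats, num_classes = 100, 16, 3
features = torch.randn(num_nodes, num_feats)
labels = torch.randint(0, num_classes, (num_nodes,))

# Learnable logits defining a probability for every candidate edge.
edge_logits = torch.nn.Parameter(torch.zeros(num_nodes, num_nodes))
# Two-layer GCN weights.
W1 = torch.nn.Parameter(torch.randn(num_feats, 32) * 0.1)
W2 = torch.nn.Parameter(torch.randn(32, num_classes) * 0.1)
opt = torch.optim.Adam([edge_logits, W1, W2], lr=0.01)

def gcn_forward(adj, x):
    # Symmetrically normalized adjacency with self-loops, as in a standard GCN.
    adj = adj + torch.eye(adj.shape[0])
    deg_inv_sqrt = adj.sum(1).clamp(min=1e-6).pow(-0.5)
    adj_norm = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]
    h = torch.relu(adj_norm @ x @ W1)
    return adj_norm @ h @ W2

for step in range(200):
    # A relaxed Bernoulli sample keeps the discrete edge distribution differentiable.
    probs = torch.sigmoid(edge_logits)
    adj_sample = torch.distributions.RelaxedBernoulli(
        temperature=torch.tensor(0.5), probs=probs).rsample()
    adj_sample = (adj_sample + adj_sample.T) / 2  # keep the graph undirected
    logits = gcn_forward(adj_sample, features)
    loss = F.cross_entropy(logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()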


Attending to Future Tokens for Bidirectional Sequence Generation

Accepted at Empirical Methods in Natural Language Processing (EMNLP) 2019. NLP has undergone a major change in recent months. Previously, a separate neural model was defined and trained for each NLP task. Recently, however, various papers (ELMo [1], ULMFiT [2], GPT [3], BERT [4], GPT-2 [5]) showed that it is possible to pre-train an NLP model on a language modelling task (more on this below) and then use this model as a starting point for fine-tuning on further tasks. This has been called an important turning point for NLP by many ([6], [7], [8], inter alia).
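
As a concrete illustration of this pre-train/fine-tune pattern, the sketch below loads a pre-trained BERT model and fine-tunes it on a toy sentence-classification task. It uses the Hugging Face transformers library, which is our assumption and is not mentioned in the post; the sentences, labels and hyperparameters are placeholders.

# Minimal sketch of the pre-train / fine-tune pattern described above.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# Start from a model pre-trained on language modelling and add a task head.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

texts = ["the movie was great", "the movie was terrible"]  # toy downstream data
labels = torch.tensor([1, 0])
inputs = tokenizer(texts, padding=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few fine-tuning steps on the downstream task
    outputs = model(**inputs, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()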


Trusted Execution Environment-based Applications in the Cloud

With the proliferation of Trusted Execution Environments (TEEs) such as Intel SGX, a number of cloud providers will soon introduce TEE capabilities within their offering (e.g., Microsoft Azure). The integration of SGX within the cloud considerably strengthens the threat model for cloud applications. However, cloud deployments depend on the ability of the cloud operator to add and remove applications dynamically; this is no longer possible under the current model for deploying and provisioning enclaves, which actively involves the application owner. In this paper, we propose ReplicaTEE, a solution that enables seamless commissioning and decommissioning of TEE-based applications in the cloud. ReplicaTEE leverages an SGX-based provisioning service that interfaces with a Byzantine Fault-Tolerant storage service to securely orchestrate enclave replication in the cloud, without the active intervention of the application owner. Namely, in ReplicaTEE, the application owner entrusts the application secrets to the provisioning service; the latter then handles all enclave commissioning and decommissioning operations throughout the application lifetime. We analyze the security of ReplicaTEE and show that it is secure against attacks by a powerful adversary that can compromise a large fraction of the cloud infrastructure. We implement a prototype of ReplicaTEE in a realistic cloud environment and evaluate its performance. ReplicaTEE increases the TCB only moderately, by ≈ 800 LoC. Our evaluation shows that ReplicaTEE does not add significant overhead to existing SGX-based applications.
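
To make the division of roles more concrete, the sketch below mimics the workflow described above: the application owner entrusts its secret to a provisioning service once, and the service then commissions and decommissions enclave replicas on its own, recording each step in BFT storage. All class and method names are hypothetical stand-ins, not ReplicaTEE's actual code or protocol.

# Illustrative role sketch only, under the assumptions stated above.
import itertools

class BftStorage:
    """Stand-in for the Byzantine fault-tolerant storage service: an append-only log."""
    def __init__(self):
        self.log = []
    def append(self, entry):
        self.log.append(entry)

class Enclave:
    """Stand-in for an SGX enclave replica."""
    _ids = itertools.count()
    def __init__(self):
        self.id = next(Enclave._ids)
        self.secret = None
    def remote_attestation_ok(self):
        return True          # real attestation would verify the enclave measurement
    def provision(self, secret):
        self.secret = secret
    def wipe(self):
        self.secret = None

class ProvisioningService:
    def __init__(self, storage):
        self.storage = storage
        self.secrets = {}    # app_id -> secret, kept inside the service's own enclave

    def entrust_secret(self, app_id, secret):
        # One-time step performed by the application owner.
        self.secrets[app_id] = secret
        self.storage.append(("entrust", app_id))

    def commission(self, app_id, enclave):
        # Attest the new replica, then provision the stored secret; no owner involvement.
        if enclave.remote_attestation_ok():
            enclave.provision(self.secrets[app_id])
            self.storage.append(("commission", app_id, enclave.id))

    def decommission(self, app_id, enclave):
        enclave.wipe()
        self.storage.append(("decommission", app_id, enclave.id))

service = ProvisioningService(BftStorage())
service.entrust_secret("app-1", b"application secret")
service.commission("app-1", Enclave())   # scales out with no further owner interaction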


Multi-task Learning for Massive Numbers of Regression Tasks

Many real-world large-scale regression problems can be formulated as Multi-task Learning (MTL) problems with a massive number of tasks, as in the retail and transportation domains. However, existing MTL methods still fail to offer both strong generalization performance and scalability for such problems; scaling MTL methods up to a tremendous number of tasks is a major challenge. Here, we propose a novel algorithm, named Convex Clustering Multi-Task regression Learning (CCMTL), which integrates multi-task regression with convex clustering on the k-nearest neighbor graph of the prediction models. CCMTL efficiently solves the underlying convex problem with a newly proposed optimization method. It is accurate, efficient to train, and empirically scales linearly in the number of tasks. On both synthetic and real-world datasets, CCMTL outperforms seven state-of-the-art (SoA) multi-task learning methods in terms of both prediction accuracy and computational efficiency. On a real-world retail dataset with 23,812 tasks, CCMTL requires only around 30 seconds to train on a single thread, while the SoA methods need hours or even days.
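
The sketch below spells out the kind of objective described above: per-task least-squares losses plus a convex-clustering penalty over a k-nearest neighbor graph built on independently fitted task models. It uses a generic gradient-based solver rather than the optimization method proposed in the paper, and the data shapes, k and the regularization weight are illustrative assumptions.

# Minimal sketch of a convex-clustering-regularized multi-task regression objective.
import numpy as np
import torch

rng = np.random.default_rng(0)
num_tasks, n_per_task, dim, k, lam = 50, 20, 5, 3, 0.1
Xs = [rng.normal(size=(n_per_task, dim)) for _ in range(num_tasks)]
true_w = rng.normal(size=dim)
ys = [X @ (true_w + 0.1 * rng.normal(size=dim)) for X in Xs]

# Independent least-squares fits, used only to build the k-NN graph of models.
w_init = np.stack([np.linalg.lstsq(X, y, rcond=None)[0] for X, y in zip(Xs, ys)])
dists = np.linalg.norm(w_init[:, None, :] - w_init[None, :, :], axis=-1)
edges = {(i, j) for i in range(num_tasks)
         for j in np.argsort(dists[i])[1:k + 1]}

# Joint objective: squared losses + lam * sum of ||w_i - w_j|| over k-NN edges.
W = torch.nn.Parameter(torch.tensor(w_init, dtype=torch.float64))
Xs_t = [torch.tensor(X) for X in Xs]
ys_t = [torch.tensor(y) for y in ys]
opt = torch.optim.Adam([W], lr=0.05)
for _ in range(300):
    loss = sum(((X @ W[t] - y) ** 2).sum() for t, (X, y) in enumerate(zip(Xs_t, ys_t)))
    loss = loss + lam * sum(torch.norm(W[i] - W[j]) for i, j in edges)
    opt.zero_grad()
    loss.backward()
    opt.step()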

We are a community of thinkers. Our teams are innovating at the cutting edge of their fields. Join a brilliant team of researchers working to solve technology’s most exciting challenges.

NLE provides a dynamic environment for research careers in a wide variety of disciplines, including machine learning, data science, security, systems platforms, IoT and 5G. Our researchers and engineers work in small teams in an informal setting. We provide a challenging and nurturing research environment alongside our renowned scientists and through worldwide collaborations.
