-
‘Threshold ECDSA in Three Rounds’
“We present a three-round protocol for threshold ECDSA signing with malicious security against a dishonest majority, which information-theoretically UC-realizes a standard threshold signing functionality, assuming ideal commitment and two-party multiplication primitives. Our work improves upon and fully subsumes the DKLs t-of-n and 2-of-n protocols. This document focuses on providing a succinct but complete description of the protocol and its security proof, and contains little expository text.” Find the paper and the full list of authors at the Cryptology ePrint Archive.
-
‘Threshold BBS+ Signatures for Distributed Anonymous Credential Issuance’
“We propose a secure multiparty signing protocol for the BBS+ signature scheme; in other words, an anonymous credential scheme with threshold issuance. We prove that due to the structure of the BBS+ signature, simply verifying the signature produced by an otherwise semi-honest protocol is sufficient to achieve composable security against a malicious adversary. Consequently, our protocol is extremely simple and efficient: it involves a single request from the client … to the signing parties, two exchanges of messages among the signing parties, and finally a response to the client.” Find the paper and full list of authors at Cryptology ePrint…
-
‘JaX: Detecting and Cancelling High-Power Jammers Using Convolutional Neural Network’
“We present JaX, a novel approach for detecting and cancelling high-power jammers in scenarios where traditional spread-spectrum techniques and other jammer-avoidance approaches are not sufficient. JaX does not require explicit probes, sounding, training sequences, channel estimation, or the cooperation of the transmitter. We identify and address multiple challenges, resulting in a convolutional neural network for a multi-antenna system to infer the existence of interference, the number of interfering emissions and their respective phases.” Find the paper and full list of authors in the 16th ACM Conference on Security and Privacy in Wireless and Mobile Networks proceedings.
-
‘A Formal Analysis of Karn’s Algorithm’
“The stability of the Internet relies on timeouts. The timeout value, known as the Retransmission TimeOut (RTO), is constantly updated, based on sampling the Round Trip Time (RTT) of each packet as measured by its sender – that is, the time between when the sender transmits a packet and receives a corresponding acknowledgement. Many of the Internet protocols compute those samples via the same sampling mechanism, known as Karn’s Algorithm. We present a formal description of the algorithm, and study its properties.” Find the paper and the full list of authors in Networked Systems.
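For readers unfamiliar with the mechanism, the RTO computation that Karn's Algorithm feeds into can be sketched per RFC 6298, with Karn's rule discarding samples from retransmitted packets (a minimal illustration of the standard algorithm, not the paper's formal model):

```python
class RtoEstimator:
    """RTO estimation per RFC 6298. Karn's rule: RTT samples measured on
    retransmitted packets are discarded, because the sender cannot tell
    which transmission the acknowledgement corresponds to."""
    ALPHA, BETA = 1 / 8, 1 / 4  # smoothing gains from RFC 6298

    def __init__(self):
        self.srtt = None      # smoothed RTT
        self.rttvar = None    # RTT variation
        self.rto = 1.0        # initial RTO of 1 second

    def on_ack(self, rtt_sample, was_retransmitted):
        if was_retransmitted:
            return self.rto   # Karn's rule: ambiguous sample, skip it
        if self.srtt is None:
            # First measurement initializes both estimators.
            self.srtt = rtt_sample
            self.rttvar = rtt_sample / 2
        else:
            self.rttvar = (1 - self.BETA) * self.rttvar \
                + self.BETA * abs(self.srtt - rtt_sample)
            self.srtt = (1 - self.ALPHA) * self.srtt + self.ALPHA * rtt_sample
        # RFC 6298 recommends a 1-second floor on the RTO.
        self.rto = max(1.0, self.srtt + 4 * self.rttvar)
        return self.rto

    def on_timeout(self):
        self.rto *= 2         # exponential backoff after a timeout
        return self.rto
```

Karn's rule plus backoff means that after a timeout, the (doubled) RTO stays in force until an unambiguous sample, from a packet that was never retransmitted, arrives.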
-
‘Evaluating the Impact of Community Oversight for Managing Mobile Privacy and Security’
“Mobile privacy and security can be a collaborative process where individuals seek advice and help from their trusted communities. To support such collective privacy and security management, we developed a mobile app for Community Oversight of Privacy and Security (“CO-oPS”) that allows community members to review one another’s apps installed and permissions granted to provide feedback. … Measures of transparency, trust, and awareness of one another’s mobile privacy and security behaviors, along with individual and community participation in mobile privacy and security co-management, increased from pre- to post-study.” Find the paper and the full list of authors at ArXiv.
-
‘Disaster World: Decision-Theoretic Agents for Simulating Population Responses to Hurricanes’
“Artificial intelligence (AI) research provides a rich source of modeling languages capable of generating socially plausible simulations of human behavior, while also providing a transparent ground truth that can support validation of social-science methods applied to that simulation. In this work, we leverage two established AI representations: decision-theoretic planning and recursive modeling. … We used PsychSim, a multiagent social-simulation framework combining these two AI frameworks, to build a general parameterized model of human behavior during disaster response, grounding the model in social-psychological theories to ensure social plausibility.” Find the paper and the full list of authors at Computational and Mathematical Organization…
-
‘Effectiveness of Teamwork-Level Interventions Through Decision-Theoretic Reasoning in a Minecraft Search-and-Rescue Task’
“Autonomous agents offer the promise of improved human teamwork through automated assessment and assistance during task performance. Studies of human teamwork have identified various processes that underlie joint task performance, while abstracting away the specifics of the task. We present here an agent that focuses exclusively on teamwork-level variables in deciding what interventions to use in assisting a human team. Our agent … relies on input from analytic components (ACs) (developed by other research teams) that process environmental information and output only teamwork-relevant measures.” Find the paper and the full list of authors in the 2023 International Conference on Autonomous Agents and Multiagent Systems…
-
‘Agent-Based Modeling of Human Decision-Makers Under Uncertain Information During Supply Chain Shortages’
“In recent years, product shortages caused by supply chain disruptions have generated problems for consumers worldwide. … Understanding how humans learn to interpret information from others and how it influences their decision-making is key to alleviating supply chain shortages. In this work, we investigated how downstream supply chain echelons, health centers in pharmaceutical supply chains, interpret and use manufacturers’ estimated resupply date (ERD) information during drug shortages.” Find the paper and the full list of authors in the 2023 International Conference on Autonomous Agents and Multiagent Systems proceedings.
-
‘Scaling Up and Stabilizing Differentiable Planning with Implicit Differentiation’
“Differentiable planning promises end-to-end differentiability and adaptivity. However, an issue prevents it from scaling up to larger-scale problems: it needs to differentiate through forward iteration layers to compute gradients, which couples forward computation and backpropagation and requires balancing forward planner performance against the computational cost of the backward pass. … We propose to differentiate through the Bellman fixed-point equation to decouple forward and backward passes for Value Iteration Network and its variants, which enables constant backward cost (in planning horizon) and flexible forward budget and helps scale up to large tasks.” Find the paper and full list of authors at Open…
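The core idea, differentiating through the fixed point rather than unrolling the forward iterations, can be illustrated on a tabular MDP (a minimal numpy sketch of the implicit-function-theorem step, not the authors' Value Iteration Network implementation; here the gradient is taken with respect to a reward vector `r`):

```python
import numpy as np

def value_iteration(r, P, gamma=0.9, iters=500):
    """Forward pass: iterate the Bellman operator to its fixed point.
    P has shape (A, S, S); r has shape (S,)."""
    V = np.zeros_like(r)
    for _ in range(iters):
        V = r + gamma * (P @ V).max(axis=0)
    return V

def implicit_grad(r, P, gamma=0.9):
    """Backward pass via the implicit function theorem: at the fixed point,
    V* = r + gamma * P_pi V* for the greedy policy pi, so
    dV*/dr = (I - gamma * P_pi)^(-1) -- no backprop through the iterations,
    hence a backward cost independent of the planning horizon."""
    V = value_iteration(r, P, gamma)
    pi = (P @ V).argmax(axis=0)           # greedy action in each state
    S = r.shape[0]
    P_pi = P[pi, np.arange(S), :]         # (S, S) policy transition matrix
    return np.linalg.inv(np.eye(S) - gamma * P_pi)
```

The returned Jacobian matches finite differences of the forward solver, while its cost is a single linear solve regardless of how many forward iterations were run.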
-
‘When Fair Classification Meets Noisy Protected Attributes’
“The operationalization of algorithmic fairness comes with several practical challenges, … [including] the availability or reliability of protected attributes in datasets. In real-world contexts, practical and legal impediments may prevent the collection and use of demographic data, making it difficult to ensure algorithmic fairness. … recent proposals aim to achieve algorithmic fairness in classification by incorporating noisiness in protected attributes or not using protected attributes at all. … Our study reveals that attribute-blind and noise-tolerant fair classifiers can potentially achieve a similar level of performance as attribute-reliant algorithms, even when protected attributes are noisy.” Find the paper and full list of…
-
‘Malicious Selling Strategies in Livestream E-Commerce: A Case Study of Alibaba’s Taobao and ByteDance’s TikTok’
“We sought to explore streamers’ malicious selling strategies and understand how viewers perceive these strategies. First, we recorded 40 livestream shopping sessions from two popular livestream platforms in China—Taobao and TikTok. We identified 16 malicious selling strategies that were used to deceive, coerce, or manipulate viewers and found that platform designs enhanced nine of the malicious selling strategies. Second, through an interview study with 13 viewers, we report three challenges of overcoming malicious selling in relation to imbalanced power between viewers, streamers, and the platforms.” Find the paper and full list of authors at ACM Transactions on Computer-Human Interaction.
-
‘Towards Unbiased Exploration in Partial Label Learning’
“We consider learning a probabilistic classifier from partially-labelled supervision (inputs denoted with multiple possibilities) using standard neural architectures with a softmax as the final layer. We identify a bias phenomenon that can arise from the softmax layer in even simple architectures that prevents proper exploration of alternative options, making the dynamics of gradient descent overly sensitive to initialisation. We introduce a novel loss function that allows for unbiased exploration within the space of alternative outputs.” Find the paper and the full list of authors at ArXiv.
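To make the setting concrete, a common baseline partial-label loss takes the negative log of the total softmax probability assigned to the candidate set (a sketch of the problem setting only; the paper's unbiased loss is a different construction designed to avoid the exploration bias it identifies):

```python
import math

def partial_label_nll(logits, candidates):
    """Baseline partial-label loss: -log of the softmax mass on the
    candidate label set. With this loss, gradient descent tends to commit
    to whichever candidate the initialisation happens to favour -- the
    bias phenomenon the paper addresses."""
    m = max(logits)                              # stabilise the softmax
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    p_candidates = sum(exps[i] for i in candidates) / total
    return -math.log(p_candidates)
```

With uniform logits over four classes and two candidate labels, the candidate set carries probability 1/2, giving a loss of log 2.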
-
‘Improved Learning-Augmented Algorithms for k-Means and k-Medians Clustering’
“We consider the problem of clustering in the learning-augmented setting. We are given a data set in d-dimensional Euclidean space, and a label for each data point given by a predictor indicating what subsets of points should be clustered together. … For a dataset of size m, we propose a deterministic k-means algorithm that produces centers with an improved bound on the clustering cost compared to the previous randomized state-of-the-art algorithm while preserving the O(dm log m) runtime.” Find the paper and the full list of authors at Open Review.
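To illustrate the setting, the most direct way to use such a predictor is to take each predicted cluster's centroid as a center and evaluate the k-means cost (a sketch of the problem setup, not the authors' algorithm, which must also produce good centers when the predictions contain errors):

```python
def centers_from_labels(points, labels, k):
    """Naive learning-augmented baseline: center j is the centroid of the
    points the predictor assigned label j. Empty clusters get the origin."""
    d = len(points[0])
    sums = [[0.0] * d for _ in range(k)]
    counts = [0] * k
    for p, label in zip(points, labels):
        for j in range(d):
            sums[label][j] += p[j]
        counts[label] += 1
    return [[s / max(c, 1) for s in row] for row, c in zip(sums, counts)]

def kmeans_cost(points, centers):
    """k-means objective: sum over points of the squared Euclidean
    distance to the nearest center."""
    return sum(
        min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers)
        for p in points
    )
```

When the predictor's labels are exactly right this baseline is optimal; the interesting regime, and the paper's subject, is when a fraction of the labels are wrong.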
-
‘Scheduling Under Non-Uniform Job and Machine Delays’
“We study the problem of scheduling precedence-constrained jobs on heterogeneous machines in the presence of non-uniform job and machine communication delays. We are given a set of n unit size precedence-ordered jobs, and a set of m related machines each with size m_i (machine i can execute at most m_i jobs at any time). … The objective is to construct a schedule that minimizes makespan, … the maximum completion time over all jobs. We consider schedules which allow duplication of jobs as well as schedules which do not.” Find the paper and full list of authors at Dagstuhl Research Online…
-
‘ThreadLock: Native Principal Isolation Through Memory Protection Keys’
“Inter-process isolation has been deployed in operating systems for decades, but secure intra-process isolation remains an active research topic. Achieving secure intra-process isolation within an operating system process is notoriously difficult. However, viable solutions that securely consolidate workloads into the same process have the potential to be extremely valuable. In this work, we present native principal isolation, a technique to restrict threads’ access to process memory by enforcing intra-process security policies defined over a program’s application binary interface (ABI).” Find the paper and full list of authors in the 2023 ACM Asia Conference on Computer and Communications Security proceedings.
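The underlying hardware primitive is memory protection keys, exposed on Linux through the `pkey_*` system calls; the sketch below tags a page with a protection key (an illustration of the primitive only, not the ThreadLock system itself, and it exits cleanly on machines without MPK support):

```c
// Minimal sketch of the Linux memory-protection-key (MPK) primitive that
// native principal isolation builds on. Requires glibc >= 2.27 and a CPU
// with protection-key support; reports gracefully when unavailable.
#define _GNU_SOURCE
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

// Returns 0 on success or when MPK is unavailable; nonzero on real errors.
int mpk_demo(void) {
    long pagesz = sysconf(_SC_PAGESIZE);
    int pkey = pkey_alloc(0, 0);  // allocate a fresh protection key
    if (pkey < 0) {
        puts("MPK unavailable on this CPU/kernel");
        return 0;  // not an error for the purposes of this sketch
    }
    char *buf = mmap(NULL, (size_t)pagesz, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED)
        return 1;
    // Tag the page with the key; the page remains read-write for now.
    if (pkey_mprotect(buf, (size_t)pagesz, PROT_READ | PROT_WRITE, pkey) != 0)
        return 1;
    buf[0] = 'x';  // still accessible from this thread
    // A thread could now revoke its own access to every page tagged with
    // this key via pkey_set(pkey, PKEY_DISABLE_ACCESS) -- a userspace PKRU
    // register write with no syscall -- which is what makes enforcing
    // per-thread memory policies cheap.
    printf("tagged page writable: %c\n", buf[0]);
    munmap(buf, (size_t)pagesz);
    pkey_free(pkey);
    return 0;
}
```

Because access rights are held in a per-thread register, two threads in the same process can see the same tagged page with different permissions, which is the property a per-principal policy needs.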
-
Grant from Broad Institute to combat antibiotic failure
Titled “Attacking Failure of Antibiotic Treatment by Targeting Antimicrobial Resistance Enabler Cell-States,” professor of biology Edward Geisinger writes that “This project aims to uncover the genetic mechanisms that underlie antibiotic treatment failure in hospital-acquired bacterial infections. We will analyze ‘enabler’ mutations and phenotypes that promote antibiotic tolerance and act as stepping stones for the development of antibiotic resistance and treatment failure. A major focus is the pathogen Acinetobacter baumannii, which causes hospital-acquired diseases including pneumonia and sepsis that have become increasingly difficult to treat.”
-
Grant from National Institutes of Health to combat drug-resistant pathogens
The project, titled “Repurposing Gram-Positive Antibiotics for Gram-Negative Bacteria Using Antibiotic Adjuvants,” studies “The multidrug-resistant (MDR) sepsis pathogen Acinetobacter baumannii,” writes professor of biology Edward Geisinger. “Current treatment options for infections with these bacteria are extremely limited. Our research examines a class of small molecules called antibiotic adjuvants that greatly boost the activity of several existing antibiotics against A. baumannii, with the goal of developing new combination approaches to treat MDR infections.”
-
Making AI more secure with privacy-preserving machine learning
“Electrical and computer engineering assistant professor Xiaolin Xu, in collaboration with Wujie Wen from Lehigh University and Caiwen Ding from the University of Connecticut, was awarded a $1.2M NSF grant for ‘Accelerating Privacy-Preserving Machine Learning as a Service: From Algorithm to Hardware.'”
-
NSF CAREER Award to protect AI-enabled systems from attack
“Electrical and computer engineering assistant professor Xiaolin Xu was awarded a $600,000 NSF CAREER Award for ‘Securing Reconfigurable Hardware Accelerator for Machine Learning: Threats and Defenses.'”
-
Securing scientific cyberinfrastructures from advanced attacks
“Electrical and computer engineering assistant professor Xiaolin Xu is leading a $1.2 million NSF grant, in collaboration with professor of electrical and computer engineering Miriam Leeser and Mike Zink from the University of Massachusetts, for ‘CAREFREE: Cloud infrAstructure ResiliencE of the Future foR tEstbeds, accelerators and nEtworks.'”
-
‘Flourishing in the Everyday: Moving Beyond Damage-Centered Design in HCI for BIPOC Communities’
“Research and design in human-computer interaction centers problem-solving, causing a downstream effect of framing work with and for marginalized communities predominantly from the lens of deficit and damage. … However, we observe an additional need to center positive aspects of humanity, such as joy, pleasure, rest, and cultural heritage, particularly for Black, Indigenous, and People of Color. In this paper, we present three case studies of existing technologies that center BIPOC flourishing to provide an alternative path for HCI.” Find the paper and the full list of authors in the 2023 ACM Designing Interactive Systems Conference proceedings.
-
‘That’s a Tough Call: Studying the Challenges of Call Graph Construction for WebAssembly’
“WebAssembly is a low-level bytecode format that powers applications and libraries running in browsers, on the server side, and in standalone runtimes. Call graphs are at the core of many interprocedural static analysis and optimization techniques. However, WebAssembly poses some unique challenges for static call graph construction. … This paper presents the first systematic study of WebAssembly-specific challenges for static call graph construction and of the state-of-the-art in call graph analysis.” Find the paper and the full list of authors in the Proceedings of the 32nd ACM SIGSOFT International Symposium on Software Testing and Analysis.
-
‘Systematic Comparisons Between Lyme Disease and Post-Treatment Lyme Disease Syndrome in the U.S. With Administrative Claims Data’
“Post-treatment Lyme disease syndrome (PTLDS) is used to describe Lyme disease patients who have the infection cleared by antibiotics but then experience persisting symptoms of pain, fatigue, or cognitive impairment. Currently, little is known about the cause or epidemiology of PTLDS. … We conducted a data-driven study with a large nationwide administrative dataset, which consists of more than 98 billion billing and 1.4 billion prescription records between 2008 and 2016, to identify unique aspects of PTLDS that could have diagnostic and etiologic values.” Find the paper and the full list of authors at EBioMedicine.
-
‘New Sampling Lower Bounds via the Separator’
“Suppose that a target distribution can be approximately sampled by a low-depth decision tree, or more generally by an efficient cell-probe algorithm. It is shown to be possible to restrict the input to the sampler so that its output distribution is still not too far from the target distribution, and at the same time many output coordinates are almost pairwise independent. This new tool is then used to obtain several new sampling lower bounds and separations, including a separation between AC0 and low-depth decision trees, and a hierarchy theorem for sampling.”
-
‘On Correlation Bounds Against Polynomials’
“We study the fundamental challenge of exhibiting explicit functions that have small correlation with low-degree polynomials over 𝔽₂. Our main contributions include: …2) We propose a new approach for proving correlation bounds with the central ‘mod functions.’ …3) We prove our conjecture for quadratic polynomials. … We express correlation in terms of directional derivatives and analyze it by slowly restricting the direction.4) We make partial progress on the conjecture for cubic polynomials, in particular proving tight correlation bounds for cubic polynomials whose degree-3 part is symmetric.” Find the paper and full list of authors at the Dagstuhl Research Online Publication Server.