Atomiqx Technologies Co.

To build a better AI helper, start by modeling the irrational behavior of humans

To build AI systems that can collaborate effectively with humans, it helps to have a good model of human behavior to start with. But humans tend to behave suboptimally when making decisions. This irrationality, which is especially difficult to model, often boils down to computational constraints. A human can’t spend decades thinking about the ideal […]

Artificial intelligence enhances air mobility planning

Every day, hundreds of chat messages flow between pilots, crew, and controllers of the Air Mobility Command’s 618th Air Operations Center (AOC). These controllers direct a fleet of roughly 1,000 aircraft, juggling variables such as which routes to fly, how long fueling or loading supplies will take, and who can fly those missions. Their mission […]

Biological data studies, scale-up the potential with machine learning

As our understanding of biological systems advances, an enormous amount of data is generated daily. These data come from various sources, for example, imaging data from high-throughput microscopic analysis in cell and developmental biology and large-scale genome-wide association studies [4]. Though manually handling these large […]

Three types of incremental learning

All experiments were run using custom-written code for the Python machine learning framework PyTorch [65]. Context sets: For the Split MNIST protocol, the MNIST dataset [66] was split into five contexts, such that each context contained two digits. The digits were randomly divided over the five contexts, so the order of the digits was different for each […]
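As a rough sketch of how such a Split MNIST context set could be built (not the authors' custom code, whose seeds and data paths are not given here), the snippet below uses torchvision to form five two-digit contexts from MNIST with a shuffled digit order; the function name, data path, and seed are illustrative.

import torch
from torchvision import datasets, transforms

def build_split_mnist_contexts(root="./data", seed=0):
    """Split MNIST into five contexts of two digits each, in a shuffled digit order."""
    generator = torch.Generator().manual_seed(seed)
    digit_order = torch.randperm(10, generator=generator).tolist()

    mnist = datasets.MNIST(root, train=True, download=True,
                           transform=transforms.ToTensor())
    targets = mnist.targets

    contexts = []
    for c in range(5):
        # Each context gets the next two digits from the shuffled order.
        digits = digit_order[2 * c: 2 * c + 2]
        mask = (targets == digits[0]) | (targets == digits[1])
        indices = mask.nonzero(as_tuple=True)[0].tolist()
        contexts.append(torch.utils.data.Subset(mnist, indices))
    return digit_order, contexts

digit_order, contexts = build_split_mnist_contexts()
print(digit_order, [len(ctx) for ctx in contexts])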

Facilitated the discovery of new γ/γ′ Co-based superalloys by combining first-principles and machine learning

Computational determination of energy datasets: It is imperative to conduct a computational reliability analysis before evaluating any set of data derived from theoretical calculations. The reliability of DFT in calculating energies depends on the choice of the exchange-correlation functional that approximates the complex many-body interactions of electrons. Perdew-Burke-Ernzerhof (PBE) is one of the standard functionals […]
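As a minimal sketch of the kind of PBE total-energy calculation such reliability checks rest on, assuming the ASE/GPAW stack as the DFT backend (the study's actual code, cutoff, and k-point settings are not specified here) and illustrative hcp-Co lattice parameters:

from ase.build import bulk
from gpaw import GPAW, PW

# Illustrative hcp cobalt cell; lattice parameters are nominal literature values.
atoms = bulk("Co", "hcp", a=2.51, c=4.07)
atoms.set_initial_magnetic_moments([1.6] * len(atoms))  # Co is ferromagnetic

atoms.calc = GPAW(mode=PW(500),        # plane-wave cutoff in eV (assumed)
                  xc="PBE",            # exchange-correlation functional
                  kpts=(8, 8, 5),      # k-point sampling (assumed)
                  txt="co_pbe.txt")
energy = atoms.get_potential_energy()  # total energy in eV
print(f"PBE total energy: {energy:.3f} eV")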

Collaborative and privacy-preserving retired battery sorting for profitable direct recycling via federated machine learning

Data collection and standardization: The unique battery kinetics of different battery types are often high-dimensional and hard to characterize due to divergent operating conditions, manufacturing variability, and usage histories [52]. To find a solution to this dilemma, we collected and standardized 130 retired batteries with 5 cathode material types from 7 manufacturers to construct an out-of-distribution, […]
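A hedged sketch of this kind of standardization plus out-of-distribution split, using pandas and scikit-learn; the file name, column names, and the choice of held-out manufacturers are illustrative assumptions, not the authors' pipeline:

import pandas as pd
from sklearn.preprocessing import StandardScaler

# Hypothetical table: one row per retired battery, with kinetic features,
# a cathode-type label, and a manufacturer label.
df = pd.read_csv("retired_batteries.csv")
feature_cols = [c for c in df.columns if c.startswith("feature_")]

# Hold out some manufacturers entirely to form an out-of-distribution test set.
holdout = {"manufacturer_6", "manufacturer_7"}
is_ood = df["manufacturer"].isin(holdout)
train, ood_test = df[~is_ood].copy(), df[is_ood].copy()

# Standardize features on the training set only, then apply to the OOD set.
scaler = StandardScaler().fit(train[feature_cols])
train[feature_cols] = scaler.transform(train[feature_cols])
ood_test[feature_cols] = scaler.transform(ood_test[feature_cols])
print(len(train), "in-distribution batteries,", len(ood_test), "OOD batteries")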

Communication-efficient federated learning via knowledge distillation

In this section, we introduce the details of our communication-efficient federated learning approach based on knowledge distillation (FedKD). We first define the problem studied in this paper, then introduce the details of our approach, and finally discuss its computation and communication complexity. Problem definition: In our […]
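As a generic illustration of the knowledge-distillation idea that FedKD builds on, where a compact student model mimics a larger teacher so that only the smaller model needs to be communicated, and not the paper's exact objective, a minimal PyTorch loss might look like:

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Weighted sum of soft-target KL distillation and hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Example with random tensors standing in for one client's local batch.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()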

The AI revolution is running out of data. What can researchers do?

The explosive improvement in artificial intelligence (AI) technology has largely been driven by making neural networks bigger and training them on more data. But experts suggest that the developers of these systems may soon run out of data to train their models. As a result, teams are taking […]