STOCK TITAN

Media Alert: BrainChip Introduces aTENNuate in New Technical Paper

BrainChip Holdings (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY) has released a technical paper introducing aTENNuate, a lightweight deep state-space autoencoder algorithm for raw audio denoising, super-resolution, and de-quantization, optimized for edge computing. The release extends BrainChip's leadership in neuromorphic event-based computing.

aTENNuate belongs to the class of state-space networks called Temporal Neural Networks (TENNs). It can capture long-range temporal relationships in speech signals, which is useful for modeling global speech patterns, noise profiles, and potentially semantic context. The technology complements BrainChip's Akida IP, an event-based neural processor that consumes less power than conventional neural network accelerators.

BrainChip's event-based AI technology is expected to impact various markets, including wearable products (earbuds, hearing aids, watches, glasses) and AIoT devices in smart homes, offices, factories, and cities.

Positive
  • Introduction of aTENNuate, a new audio processing algorithm optimized for edge computing
  • Expansion of BrainChip's leadership in neuromorphic event-based computing
  • Potential applications in wearable products and AIoT devices across various sectors
Negative
  • None.

LAGUNA HILLS, Calif.--(BUSINESS WIRE)-- BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI, today released a technical paper detailing the company’s aTENNuate, a lightweight deep state-space autoencoder algorithm that can perform raw audio denoising, super-resolution, and de-quantization, optimized for the edge.

“Real-time Speech Enhancement on Raw Signals with Deep State-space Modeling” further expands BrainChip’s leadership in the neuromorphic event-based computing space by showcasing the company’s approach to advancing state-of-the-art deep learning audio-denoising methods. The white paper covers state-space modeling, network architecture, event-based benchmark experiments and future directions for the approach.

BrainChip’s aTENNuate network belongs to the class of state-space networks called Temporal Neural Networks (TENNs). As a state-space model, aTENNuate can capture long-range temporal relationships present in speech signals using stable linear recurrent units. Learning long-range correlations is useful for capturing global speech patterns or noise profiles, and may implicitly capture semantic context that aids speech-enhancement performance.
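The stable-linear-recurrence idea can be sketched in a few lines. The snippet below is an illustrative single-channel recurrence only, assuming a simple diagonal state-space form; it is not BrainChip's actual aTENNuate code, and all names and coefficients are hypothetical:

```python
import numpy as np

def linear_recurrent_unit(x, a, b, c):
    """Run a hypothetical single-channel linear state-space recurrence.

    x : input samples, shape (T,)
    a : state-transition coefficient; |a| < 1 keeps the recurrence stable
    b : input-projection coefficient
    c : output-projection coefficient
    """
    h = 0.0
    y = np.empty(len(x), dtype=float)
    for t, x_t in enumerate(x):
        h = a * h + b * x_t   # state update: exponentially decaying memory
        y[t] = c * h          # linear readout of the hidden state
    return y

# A transition coefficient close to 1 gives a long effective memory
# (on the order of 1/(1 - a) steps), which is how stable linear recurrent
# units can track slowly varying structure such as a noise profile.
signal = np.ones(100)
out = linear_recurrent_unit(signal, a=0.99, b=0.01, c=1.0)
```

In a full model, many such units with different decay rates would be stacked and interleaved with nonlinearities, so that fast and slow temporal structure are captured simultaneously.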

aTENNuate is the latest technological advancement added to BrainChip’s IP portfolio and an expansion of Temporal Event-Based Neural Nets (TENNs), the company’s approach to streaming and sequential data. It complements the company’s neural processor, Akida IP, an event-based technology that is inherently lower power than conventional neural network accelerators. Lower power affords greater scalability and lower operational costs. BrainChip’s Akida™ supports incremental learning and high-speed inference in a wide variety of use cases. Among the markets that BrainChip’s event-based AI technology will impact are the next generation of wearable products, such as earbuds, hearing aids, watches and glasses, as well as AIoT devices in the smart home, office, factory or city.

Paper is available here: https://bit.ly/3Tz1xyx

Code is available here: https://bit.ly/3MUsLMg

To access additional learning resources regarding neuromorphic AI, essential AI, TENNs and other technical achievements, interested parties are invited to visit BrainChip’s white paper library at https://brainchip.com/white-papers-case-studies/

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY) BrainChip is the worldwide leader in Edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain, analyzing only essential sensor inputs at the point of acquisition and processing data with unparalleled efficiency, precision, and economy of energy. Akida uniquely enables Edge learning local to the chip, independent of the cloud, dramatically reducing latency while improving privacy and data security. Akida neural processor IP, which can be integrated into SoCs on any process technology, has shown substantial benefits on today’s workloads and networks, and offers a platform for developers to create, tune and run their models using standard AI workflows like TensorFlow/Keras. In enabling effective Edge compute to be universally deployable across real-world applications such as connected cars, consumer electronics, and industrial IoT, BrainChip is proving that on-chip AI, close to the sensor, is the future for its customers’ products as well as the planet. Explore the benefits of Essential AI at www.brainchip.com.

Follow BrainChip on Twitter: https://www.twitter.com/BrainChip_inc

Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006

Media Contact:

Mark Smith

JPR Communications

818-398-1424

Investor Relations:

Tony Dawe

Director, Global Investor Relations

tdawe@brainchip.com

Source: BrainChip Holdings Ltd

FAQ

What is aTENNuate and how does it relate to BrainChip's BRCHF stock?

aTENNuate is a lightweight deep state-space autoencoder algorithm developed by BrainChip (BRCHF) for raw audio denoising, super resolution, and de-quantization optimized for edge computing. It represents a technological advancement in BrainChip's IP portfolio, potentially impacting the company's market position and stock value.

How does aTENNuate technology benefit BrainChip's BRCHF products?

aTENNuate complements BrainChip's (BRCHF) Akida IP, an event-based neural processor. It enhances the company's capabilities in processing streaming and sequential data, offering lower power consumption and greater scalability, which could lead to improved product performance and market competitiveness.

What markets could BrainChip's BRCHF stock benefit from with the new aTENNuate technology?

BrainChip's (BRCHF) aTENNuate technology could potentially benefit markets for wearable products such as earbuds, hearing aids, watches, and glasses, as well as AIoT devices in smart homes, offices, factories, and cities. This broad application range could positively impact the company's market reach and stock performance.

How does the release of the aTENNuate technical paper affect BrainChip's BRCHF market position?

The release of the aTENNuate technical paper reinforces BrainChip's (BRCHF) leadership in neuromorphic event-based computing. It demonstrates the company's ongoing innovation and could potentially strengthen its market position, attracting investor interest and possibly influencing the stock's value.
