Advancements in Fairness Research for Image Recognition


This is the 664th original article. Keywords: fairness, bias, debiasing learning, image recognition, deep learning. Algorithmic fairness is one of the central themes in the benevolent development of artificial intelligence and a key component of trustworthy AI. Building sound models that ensure unbiased algorithmic decision-making is a necessary … Read more

Speech Recognition Method Based on Multi-Task Loss with Additional Language Model


DOI: 10.3969/j.issn.1671-7775.2023.05.010. Open Science (Resource Service) Identifier Code (OSID). Citation format: Liu Yongli, Zhang Shaoyang, Wang Yuheng, et al. Speech Recognition Method Based on Multi-Task Loss with Additional Language Model[J]. Journal of Jiangsu University (Natural Science Edition), 2023, 44(5): 564-569. Fund project: Shaanxi Provincial Key Industry Innovation Chain (Group) Project … Read more

A Brief History of Speech Recognition Technology


[CSDN Editor’s Note] Since its inception more than half a century ago, speech recognition remained largely dormant until advances in deep learning around 2009 dramatically improved its accuracy. Although it still cannot be applied in unrestricted domains or across unlimited populations, it has provided a convenient and efficient means of communication in … Read more

OPT Smart Code Reader with Deep Learning OCR Technology


The domestic code reader market offers a wide variety of products, but few have ventured into OCR (Optical Character Recognition) applications. Drawing on years of technological accumulation, OPT has launched a code reader equipped with a deep learning OCR algorithm, pushing code reader applications into more refined and complex … Read more

What Is AIGC and Its Applications?


AIGC stands for Artificial Intelligence Generated Content, considered a new form of content creation following Professionally Generated Content (PGC) and User Generated Content (UGC). It is currently used mainly for text, images, videos, audio, games, and virtual humans. Specifically, this includes: 1) Text creation. AIGC-generated text is mainly applied in news writing, … Read more

Computer English Vocabulary (Part One)

Computer English is a required course for computer science students. Mastering it requires continuously learning new computer technologies and professional vocabulary to build one’s expertise. In this issue, let us learn some relevant Computer English vocabulary! 01 abridgement UK [əˈbrɪdʒmənt] US [əˈbrɪdʒmənt] n. abbreviation; reduction [Example]: … Read more

Comprehensive Guide to DeepSeek: 90% of Users Don’t Know These Tips


1. What is DeepSeek? Recently, a dark horse has emerged in the AI field: DeepSeek, officially Hangzhou DeepSeek Artificial Intelligence Basic Technology Research Co., Ltd. Although it was founded on July 17, 2023, making it a “young player,” it has already stirred up waves in the AI arena. The … Read more

Overview of 17 Efficient Variants of Transformer Models


Reprinted from | Xiaoyao’s Cute Selling House. Written by | Huang Yu. Source | Zhihu. In the field of NLP, the Transformer has largely replaced RNNs (LSTM/GRU), and it has also found applications in CV, such as object detection and image annotation, as well … Read more

Introduction to Attention Mechanisms in Three Transformer Models and PyTorch Implementation


This article delves into three key attention mechanisms in Transformer models: self-attention, cross-attention, and causal self-attention. These mechanisms are core components of large language models (LLMs) like GPT-4 and Llama. By understanding these attention mechanisms, we can better grasp how these models work and their potential applications. We will discuss not only the theoretical concepts … Read more
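The teaser above refers to the article’s PyTorch implementations, which are not reproduced here. As a hedged illustration of the first of the three mechanisms, a minimal scaled dot-product self-attention can be sketched in NumPy; the projection matrices `Wq`, `Wk`, `Wv` and the tiny dimensions are illustrative assumptions, not the article’s code:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Minimal scaled dot-product self-attention (single head).
    X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # token-to-token similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V, weights                       # context vectors, attention map

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))                       # 4 tokens, model dim 8
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
print(out.shape, attn.shape)
```

Cross-attention follows the same math with Q projected from one sequence and K, V from another; causal self-attention additionally masks out future positions before the softmax.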

What Is the Transformer Model?


Welcome to the special winter vacation column “High-Tech Lessons for Kids” presented by Science Popularization China! Artificial intelligence, as one of the most cutting-edge technologies today, is rapidly changing our lives at an astonishing pace. From smart voice assistants to self-driving cars, from AI painting to machine learning, it opens up a future full of … Read more