Author: Nathan Horrocks
Translation: Gabriel Ng
Proofreading: Zhang Ruiyi
This article is approximately 3,300 words long and takes about 5 minutes to read.
This article introduces how to use AI for high-precision image editing.
- Extremely high-precision editing;
- Requires very little annotated training data (no external classifiers needed);
- Runs and responds interactively in real time;
- Allows multiple edits to be combined directly;
- Works on real embedded images, GAN-generated images, and even out-of-domain images.
“The framework allows us to learn an arbitrary number of editing vectors, which can then be directly applied to other images at interactive rates,” the researchers explained in their study. “We show that EditGAN can manipulate images with an unprecedented level of detail and freedom while preserving full image quality. We can also easily combine multiple edits and perform plausible edits beyond EditGAN’s training data. We demonstrate EditGAN on a wide variety of image types and quantitatively outperform several previous editing methods on standard editing benchmark tasks.”
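To make the idea of editing vectors more concrete, here is a minimal sketch of how several learned latent-space directions might be scaled, summed, and rendered in a single generator pass. It assumes a pre-trained StyleGAN-like generator G; the names (apply_edits, editing_vectors, strengths) are illustrative assumptions, not EditGAN’s actual API.

```python
import torch

# A minimal sketch, assuming a pre-trained StyleGAN-like generator G
# that maps a latent code w to an image. All names below are
# hypothetical, not EditGAN's actual API.

def apply_edits(G, w, editing_vectors, strengths):
    """Apply several learned edits to one latent code and render the result.

    w               -- latent code of the image being edited
    editing_vectors -- {"smile": delta_w, "gaze_left": delta_w, ...},
                       each delta_w the same shape as w
    strengths       -- {"smile": 1.0, "gaze_left": 0.5, ...} scale factors
    """
    w_edited = w.clone()
    # Because edits live in latent space, multiple edits compose by
    # simple addition, and the combined result is rendered in one pass.
    for name, delta_w in editing_vectors.items():
        w_edited = w_edited + strengths.get(name, 0.0) * delta_w
    return G(w_edited)
```

In this picture, making a face smile more while shifting its gaze simply means passing two vectors with the desired strengths, which is why multiple edits can be merged so easily.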
From adding smiles and changing a person’s gaze direction to creating a new hairstyle or giving a car a better set of wheels, the researchers demonstrated how much fine-grained control the model offers with minimal data annotation. Users can draw a rough sketch or segmentation map of the parts they want to edit, guiding the AI model to carry out the modification, such as enlarging a cat’s ears or making a car’s headlights look cooler. The model then renders the edited image with high accuracy while preserving the quality of the original. The same edits can subsequently be applied to other images in real time.
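This segmentation-driven workflow can be sketched roughly as an optimization over a latent offset: the offset is tuned so that the generator’s segmentation output matches the user’s edited mask, while pixels outside the edited region stay fixed. The sketch below is illustrative only; G_joint (a generator that jointly outputs an image and segmentation logits), target_seg, roi, and the loss weights are all assumptions, not the paper’s exact formulation.

```python
import torch
import torch.nn.functional as F

# Illustrative only: assumes a hypothetical generator G_joint that
# returns (image, segmentation logits) for a latent code, a user-edited
# integer mask target_seg of shape (N, H, W), and a binary mask roi of
# shape (N, 1, H, W) marking the edited region.

def learn_editing_vector(G_joint, w, target_seg, roi, steps=100, lr=0.05):
    delta_w = torch.zeros_like(w, requires_grad=True)
    opt = torch.optim.Adam([delta_w], lr=lr)
    with torch.no_grad():
        image_orig, _ = G_joint(w)  # reference for the untouched pixels

    for _ in range(steps):
        image, seg_logits = G_joint(w + delta_w)
        # Pull the generated segmentation toward the user's edited mask.
        seg_loss = F.cross_entropy(seg_logits, target_seg)
        # Keep pixels outside the edited region close to the original.
        preserve_loss = ((image - image_orig) * (1 - roi)).abs().mean()
        loss = seg_loss + 10.0 * preserve_loss
        opt.zero_grad()
        loss.backward()
        opt.step()

    # The learned offset can later be re-applied to other images' latents.
    return delta_w.detach()
```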
Fig. 2: An example of assigning pixels to different parts of an image. The AI identifies these parts and edits them based on human input.
For more information about this magical method, please see the paper: https://arxiv.org/pdf/2111.03186.pdf
Translator’s Profile

Gabriel Ng is an undergraduate student majoring in probability and statistics at Tsinghua University who is passionate about data analysis, language learning, and music. His daily life revolves around studying, fitness, and music. He enjoys exploring the essence of problems through data mining and learning the stories of different cultures through language. By accumulating knowledge and experience from different perspectives, he hopes to analyze problems rationally and understand them with empathy.
Translation Team Recruitment Information
Job description: A meticulous mind is needed to translate selected foreign-language articles into fluent Chinese. If you are an international student in data science, statistics, or a computer-related field, are working abroad in a related field, or are confident in your language skills, you are welcome to join the translation team.
What you can get: regular translation training that improves volunteers’ translation skills and deepens their understanding of cutting-edge data science, while friends overseas can stay connected with technology developments back home. The THU Data Team’s industry-academia-research background also provides good development opportunities for volunteers.
Other benefits: you will work alongside data science professionals from well-known companies and students from prestigious universities such as Peking University and Tsinghua University, as well as from overseas institutions.
Click “Read Original” at the end of the article to join the Data Team~
Reprint Notice
If you need to reprint this article, please prominently indicate the author and source at the beginning (“Reprinted from: Data Team, ID: DatapiTHU”) and place a prominent Data Team QR code at the end of the article. For articles marked as original, please send [Article Name – Name and ID of the public account awaiting authorization] to the contact email to apply for whitelist authorization, and edit as required.
After publication, please send the link back to the contact email (see below). Unauthorized reprints and adaptations will be pursued legally.
Click “Read Original” to embrace the organization.