Optimizing Neural Networks with MorphNet from Google AI

Compiled by Yu Yang | QbitAI Official Account

Want to adjust your neural network to complete specific tasks? It’s not as simple as it seems.

Deep Neural Networks (DNNs) are great building blocks, but adapting them to new tasks can be very costly in computational resources and time.

Now, Google AI has released MorphNet. After testing it with the popular image classification neural network Inception V2, they found that the neural network became faster and smaller without sacrificing accuracy, while also reducing computational resource consumption!

What is MorphNet

MorphNet is a neural network model optimization technique that aims to optimize existing architectures for specific tasks.

In other words, this is a transfer-learning problem. The difficulty of transfer learning lies in finding invariants: the model must handle tasks that are similar, but not identical, to the one it was originally trained on, and that mismatch can significantly degrade performance or even cause outright failure.

What makes MorphNet effective is that, given a neural network built for a similar problem as input, it can produce a smaller, faster architecture better suited to the new task.


MorphNet optimizes neural networks through two phases: contraction and expansion.

Contraction Phase

In the contraction phase, MorphNet identifies inefficient neurons and uses a sparse regularizer to prune them.

It is important to note that MorphNet computes each neuron's cost with respect to the targeted resource, so during training the optimizer is aware of resource consumption and can learn which neurons are efficient and which can be removed.
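The idea can be illustrated with a minimal sketch (hypothetical code, not the MorphNet library's API): a sparsity penalty on per-channel scale factors, such as batch-norm gammas, where each channel's term is weighted by the FLOPs that channel costs. Expensive but inactive channels are then pushed toward zero first and become cheap to prune.

```python
import numpy as np

def flop_weighted_l1(gammas, flops_per_channel):
    """Sparsity penalty: each channel's L1 term is weighted by the
    FLOPs that channel costs, so expensive-but-inactive channels
    are driven toward zero first."""
    return float(np.sum(flops_per_channel * np.abs(gammas)))

# Hypothetical per-channel scale factors and FLOP costs.
gammas = np.array([0.9, 0.01, 0.5, 0.002])
flops = np.array([100.0, 100.0, 50.0, 200.0])
penalty = flop_weighted_l1(gammas, flops)

# After training with this penalty added to the task loss, prune
# channels whose learned scale falls below a threshold.
keep = np.abs(gammas) > 0.05
print(keep)  # [ True False  True False]
```

This is only the shape of the mechanism; the paper's actual regularizers are structured (group-wise) and defined per resource target, but the FLOP-weighting idea is the same.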

Not clear? Let's look at an example of how MorphNet calculates the computational cost of a neural network (such as FLOPs, the number of floating-point operations):

Assume a neural network layer represented as matrix multiplication, which has 2 inputs (Xn), 6 weights (a, b, …, f), and 3 outputs (Yn; neurons). This means evaluating this layer requires 6 multiplications.

[Figure: three matrix-multiplication examples (full, zeroed weights, pruned output), from the MorphNet paper]

MorphNet considers the number of multiplications as the product of the number of inputs and outputs. In the left example, although two weights are zero, all multiplications still need to be performed during evaluation. However, the middle example shows the sparsity of the structure, where MorphNet can identify that its output count is 2, and the number of multiplications for this layer has decreased from 6 to 4. Following this idea, MorphNet can determine the incremental cost of each neuron in the network to produce a more efficient model like the one on the right.
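The counting rule above can be written out directly. This is a toy sketch of the cost model, not MorphNet's implementation: the cost of a fully connected layer is the product of its input and output counts, so only removing a whole output neuron (structured sparsity) actually lowers the cost.

```python
def layer_multiplications(num_inputs, num_outputs):
    """Cost of a dense layer, counted as inputs * outputs multiplications."""
    return num_inputs * num_outputs

# Full layer: 2 inputs, 3 outputs -> 6 multiplications.
# Zeroing individual weights does not change this count, because
# every input-output pair is still evaluated.
print(layer_multiplications(2, 3))  # 6

# Structured sparsity: removing an entire output neuron (a whole
# column of weights) reduces the count from 6 to 4.
print(layer_multiplications(2, 2))  # 4
```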

Expansion Phase

In the expansion phase, MorphNet uses width multipliers to uniformly expand the size of all layers.

For example, with a 50% expansion, an inefficient layer that shrank from 100 neurons to 10 expands back only to 15, while an important layer that shrank from 100 to 80 expands to 120, gaining more of the available resources.

In other words, the final effect of MorphNet is to reallocate computational resources from the inefficient parts of the network to the efficient parts.
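The arithmetic of this shrink-then-expand reallocation can be sketched in a few lines (illustrative code, assuming the shrunken sizes from the example above; the function name is ours, not MorphNet's):

```python
def resize_layer(neurons_after_shrink, expansion=0.5):
    """Uniformly expand a shrunken layer by the same width multiplier."""
    return round(neurons_after_shrink * (1 + expansion))

# Inefficient layer: 100 -> 10 after shrinking, expands back only to 15.
print(resize_layer(10))   # 15
# Important layer: 100 -> 80 after shrinking, expands to 120.
print(resize_layer(80))   # 120
```

Because the multiplier is uniform, the layers that kept the most neurons through the contraction phase end up with the largest share of the expanded budget.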


MorphNet Optimizes AI Models

How Effective Is It

The Google AI team trained the Inception V2 network model using MorphNet.


Image from the MorphNet Paper

The baseline method uses a width multiplier to uniformly shrink the number of outputs of every convolution, trading accuracy against computational cost (red line).

The MorphNet method, on the other hand, directly targets computational consumption, generating a better trade-off curve when contracting the model (blue line).

At the same accuracy level, the MorphNet method reduced computational consumption by 11% to 15%.

MorphNet has shown excellent performance in optimizing Inception V2, and it is effective for other network models as well.

Image from the MorphNet Paper

It successfully compressed model size and FLOPs with almost no loss in quality. Truly a top-notch piece of work from Google.

This highly effective tool is already being used by Google. The Google AI team stated that MorphNet has been applied to several production-scale image processing models at Google.

Links

MorphNet is now open-source.

GitHub link: https://github.com/google-research/morph-net

Paper link: https://arxiv.org/pdf/1711.06798.pdf

The author is a signed author of NetEase News – NetEase Account “Various Attitudes”

End
