Since the last update on the free Tencent Cloud GPU computing power, which offers 10,000 free minutes per month for a limited time, many friends have asked whether it can be used to deploy model applications. After two nights of tinkering, I finally managed to deploy flowise, an open-source AI large-model workflow tool I have been researching recently.
For an introduction to flowise, you can check out its open-source repository: https://github.com/FlowiseAI/Flowise
First, let’s take a look at a screenshot of the finished result, built on the Cloud Studio ‘Hunyuan Dit’ template.
Then I simply set up a Chatflow using ollama and ran it.
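For anyone who wants to try the same combination, here is a minimal sketch of the commands involved. It assumes Node.js and ollama are already available in the workspace; the model name and ports are illustrative and not taken from my actual setup.

```bash
# Install and start Flowise (the UI listens on port 3000 by default).
npm install -g flowise
npx flowise start

# In another terminal: start the ollama server and pull a model.
# "llama3" is just an example model name.
ollama serve &
ollama pull llama3

# In the Flowise UI, create a Chatflow with a ChatOllama node and point
# its Base URL at http://localhost:11434 (ollama's default port).
```

Once the Chatflow is saved, you can test it directly in Flowise’s built-in chat panel.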

The whole setup ended up using 81G of disk space, which is quite a lot. In my testing so far, only the Hunyuan Dit template offers that much disk; the other templates provide around 50G.

However, this template has a downside: many system commands and environments have to be installed manually. Although it appears to run Ubuntu, some system-level commands simply refuse to work no matter how I debug them, which is a bit strange.
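As a rough illustration of the kind of manual setup I mean, the usual workaround on an Ubuntu-based image is to install the missing tools yourself with apt; the package names below are only examples, not a list of what this template is actually missing.

```bash
# Example only: install a few common tools that minimal images often lack.
sudo apt-get update
sudo apt-get install -y curl git build-essential

# If even sudo is missing (not unusual in stripped-down images), run the
# same commands as root without it:
apt-get update && apt-get install -y sudo curl git
```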
Due to space limitations and some reasons best left unsaid, if anyone is interested, you can visit my blog for the detailed installation tutorial and reply with “flowise”.