Can't wait for the day I can actually try a diffusion model on my own machine (128GB M4 Max) rather than as a hosted service. So far I haven't seen a single piece of software that supports it.
You can try one today; the weights are on Hugging Face. Here is an example:
https://huggingface.co/tencent/WeDLM-8B-Instruct
Diffusion isn't natively supported in the transformers library yet, so you have to use the model's custom inference code.
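A minimal sketch of what loading it looks like, assuming the repo follows the usual Hugging Face pattern of shipping its custom inference code in the model repo (enabled via trust_remote_code). Check the model card for the actual generation API, since diffusion decoding differs from the standard generate() loop:

```python
MODEL_ID = "tencent/WeDLM-8B-Instruct"  # repo id from the link above

def load_model(model_id: str = MODEL_ID):
    # Imports are inside the function so the sketch reads standalone;
    # in practice put them at the top of your script.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # trust_remote_code=True pulls in the repo's custom modeling/inference
    # code, since diffusion decoding isn't built into transformers yet.
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        torch_dtype="auto",   # keep the checkpoint's native dtype
        device_map="auto",    # place weights on GPU/MPS/CPU as available
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
```

On a 128GB M4 Max, device_map="auto" should land the weights on MPS; an 8B model in bf16 is roughly 16GB, so it fits comfortably.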