Nomos8kHAT-L_otf version v1.0 (ID: 488342)

Note: This upscaler is not mine

Credit to Helaman

Originally uploaded at: https://openmodeldb.info/models/4x-Nomos8kHAT-L-otf

About version 2.0

Everything is the same as in the first upload, but converted to .safetensors. I had issues getting Forge and Automatic1111 to load the .safetensors version of the upscaler, but it works like a charm in ComfyUI.
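Independent of which UI you use, you can sanity-check a converted checkpoint outside the webui. A minimal sketch, assuming the torch and safetensors packages are installed; the .safetensors file name here is a guess, so adjust it to whatever the version 2.0 upload is actually called.

```python
# Minimal sketch (not part of the release): check that both checkpoint
# formats are readable and contain the same tensors.
import torch
from safetensors.torch import load_file

state_pt = torch.load("nomos8khatLOtf_v10.pt", map_location="cpu")
state_st = load_file("nomos8khatLOtf_v10.safetensors")  # assumed file name

# Some releases nest the weights under a "params" or "params_ema" key,
# so unwrap the .pt dict before comparing key sets.
if isinstance(state_pt, dict):
    state_pt = state_pt.get("params_ema", state_pt.get("params", state_pt))

print(".pt tensors:", len(state_pt), "| .safetensors tensors:", len(state_st))
print("key sets match:", set(state_pt.keys()) == set(state_st.keys()))
```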

General info

Hybrid Attention Transformer (HAT) combines channel attention and self-attention schemes to make use of their complementary advantages. To enhance the interaction between neighboring window features, HAT also employs an overlapping cross-attention module.
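For illustration only (this is not Helaman's or the HAT authors' code), a squeeze-and-excitation style channel-attention block shows the "channel attention" half of that combination; HAT pairs blocks like this with window-based self-attention. All names and sizes below are assumptions.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Reweights feature channels using globally pooled statistics."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # squeeze: global spatial average
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                          # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.attn(x)                    # excite: rescale each channel

# Example: attend over a batch of 64-channel feature maps.
feats = torch.randn(1, 64, 48, 48)
print(ChannelAttention(64)(feats).shape)           # torch.Size([1, 64, 48, 48])
```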

Where does it go?

To use this (and other HAT upscalers) with Automatic1111 and Forge, follow these steps (a scripted version is sketched after the note below):

  • Create a folder in \webui\models\ and name it HAT

  • Download the file either here or from the source

  • Place the file in \webui\models\HAT\

  • Restart your webui

Note: If you have issues getting the model to work, change the file extension from .pt to .pth
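The same steps as a small Python sketch, in case you prefer scripting the setup. The webui root path is an assumption (adjust it to your install), and the rename to .pth is only needed if the webui does not pick the model up.

```python
# Minimal sketch of the install steps above, assuming a standard webui layout.
from pathlib import Path
import shutil

webui_root = Path(r"C:\stable-diffusion-webui")    # assumption: your webui folder
hat_dir = webui_root / "models" / "HAT"
hat_dir.mkdir(parents=True, exist_ok=True)         # step 1: create models\HAT

downloaded = Path("nomos8khatLOtf_v10.pt")         # step 2: the downloaded file
target = hat_dir / downloaded.name
shutil.copy2(downloaded, target)                   # step 3: place it in models\HAT

# Only if the model is not detected: rename .pt to .pth (see note above).
target.rename(target.with_suffix(".pth"))

# step 4: restart the webui so the new upscaler is picked up.
```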

Description:

Trained words:

Name: nomos8khatLOtf_v10.pt

Size (KB): 161912

Type: Model

Pickle scan result: Success

Pickle scan message: No Pickle imports

Virus scan result: Success
