
Note: This upscaler is not mine
Credit to Helaman
Originally uploaded at: https://openmodeldb.info/models/4x-Nomos8kHAT-L-otf
About version 2.0
Everything is the same as with the first upload, but the file has been converted to .safetensors. I had issues getting Forge and Automatic1111 to load the .safetensors version of the upscaler, but it works like a charm in ComfyUI.
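If you want to reproduce the conversion yourself, a minimal sketch using PyTorch and the safetensors library might look like the following. The file names are examples, and the nested "params"/"params_ema" keys are an assumption (some training frameworks store weights that way); inspect your own checkpoint to be sure.

```python
import torch
from safetensors.torch import save_file

# Example file names -- adjust to your local paths.
ckpt = torch.load("4xNomos8kHAT-L_otf.pth", map_location="cpu", weights_only=True)

# Some training frameworks nest the weights under "params" or "params_ema"
# (an assumption here -- print(ckpt.keys()) to check your checkpoint).
state_dict = ckpt.get("params_ema", ckpt.get("params", ckpt))

# save_file expects a flat dict of contiguous tensors.
save_file({k: v.contiguous() for k, v in state_dict.items()},
          "4xNomos8kHAT-L_otf.safetensors")
```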
General info
Hybrid Attention Transformer (HAT) combines channel attention and self-attention schemes, making use of their complementary advantages. To enhance the interaction between neighboring window features, HAT employs an overlapping cross-attention module.
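For intuition, here is a minimal sketch of a squeeze-and-excite style channel attention block of the kind HAT pairs with window self-attention. The channel count and squeeze ratio are illustrative, not the model's actual configuration.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Pool per-channel statistics, then rescale each channel
    by a learned gate -- the 'channel attention' half of HAT."""
    def __init__(self, channels: int, squeeze: int = 16):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                     # (B, C, 1, 1) global stats
            nn.Conv2d(channels, channels // squeeze, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // squeeze, channels, 1),
            nn.Sigmoid(),                                # per-channel gate in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.attn(x)                          # rescale channels

# Illustrative usage on a dummy feature map.
feats = torch.randn(1, 64, 32, 32)
print(ChannelAttention(64)(feats).shape)  # torch.Size([1, 64, 32, 32])
```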
Where does it go?
To use this (and other HAT upscalers) with Automatic1111 and Forge, follow these steps (a scripted version is sketched after the list):
- Create a folder in \webui\models\ and name it HAT
- Download the file either here or from the source
- Place the file in \webui\models\HAT\
- Restart your webui
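The same steps as a small Python sketch, if you prefer to script them. The webui install path and download location are examples; point them at your own folders.

```python
from pathlib import Path
import shutil

WEBUI = Path(r"C:\stable-diffusion-webui")           # example install path
hat_dir = WEBUI / "models" / "HAT"
hat_dir.mkdir(parents=True, exist_ok=True)           # step 1: create models\HAT

# Step 2/3: copy the downloaded file into the new folder (example path).
downloaded = Path.home() / "Downloads" / "4xNomos8kHAT-L_otf.pth"
shutil.copy2(downloaded, hat_dir / downloaded.name)
# Step 4: restart the webui so it rescans the models folder.
```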
Note: If you have issues getting the model to load, change the file extension from .pt to .pth.
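A one-liner for the rename, with an example path:

```python
from pathlib import Path

p = Path(r"C:\stable-diffusion-webui\models\HAT\nomos8khatLOtf_v10.pt")
p.rename(p.with_suffix(".pth"))  # .pt -> .pth
```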
Description:
Trained words:
Name: nomos8khatLOtf_v10.pt
Size (KB): 161912
Type: Model
Pickle scan result: Success
Pickle scan message: No Pickle imports
Virus scan result: Success