Hacker News

It isn't reasonable to compare the 7B model with the likes of ChatGPT 3.5, which has 175 billion parameters. Stability does plan to produce a comparable model, though; right now they're working on models of up to 65 billion parameters.

These are just the very early stages of development, and the exciting thing is that it's something you can actually run yourself, and it's free for commercial use.
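One way to see why the 7B size matters for running it yourself: the memory needed just to hold the weights scales linearly with parameter count. A back-of-envelope sketch (the bytes-per-parameter figures are my assumptions for fp16 and 8-bit quantization, not from the thread):

```python
def weight_memory_gb(params_billions: float, bytes_per_param: int) -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# 7B model in fp16 (~2 bytes/param): ~14 GB, or ~7 GB with 8-bit
# quantization -- within reach of consumer hardware.
print(weight_memory_gb(7, 2))    # 14.0
# A 175B model in fp16: ~350 GB of weights alone, which is not.
print(weight_memory_gb(175, 2))  # 350.0
```

This ignores activation memory and KV cache, so real requirements are somewhat higher, but it shows the order-of-magnitude gap between the two model sizes.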


