DEV Community
#moe
Mixture of Experts
Krun_pro · Apr 1
#mixture #moe #gpu
3 min read
NVIDIA's Nemotron 3 Super: 120B Parameters, 12B Active. Why That Matters.
Alan West · Mar 27
#nvidia #nemotron #moe #localinference
4 min read
Run a Local Neuro Sama with Just **1 Line of Code** Using the LivinGrimoire software design pattern
owly · Mar 8
#moe #python #ai #designpatterns
2 min read
The CoderPunk Guide to Mixture of Experts: Requipping AI Like Fairy Tail's Elza
owly · Feb 27
#designpatterns #python #ai #moe
4 min read
We're a place where coders share, stay up-to-date and grow their careers.