Just tried the new open-source 20B-parameter OpenAI gpt-oss model on my laptop. Here’s what I got.
I have no idea why it needed to generate code for a multi-threaded Fibonacci calculator! The funny part is that it did all that, then just loaded the original multiplication request into Python and printed the result.
On the plus side, the performance was pretty decent.
I don’t know the right terminology to ask about this knowledgeably, so I apologize in advance.
How did your model reach out to Python? How are you running the model?
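Not the original poster, but "reaching out to Python" usually means the model emits a code-style tool call and the local runner executes it, then feeds the output back. A minimal sketch of that loop, assuming a hypothetical model reply (real runners sandbox this and differ in their APIs):

```python
import io
import contextlib

def run_model_code(code: str) -> str:
    """Execute model-emitted Python and capture whatever it prints.
    NOTE: no sandboxing here; real tool runners isolate this step."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})  # run the snippet in a fresh namespace
    return buf.getvalue()

# Pretend the model answered a multiplication request by emitting code:
model_reply = "print(123 * 456)"
result = run_model_code(model_reply)  # the runner would return this to the model
print(result)
```

The actual mechanism depends on how the model is being served (e.g. which runtime and which tools it exposes), which is what the question above is getting at.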
Mystery solved. I wrote about it in a separate comment, here: https://lemmy.world/comment/18655333