I've been looking at the code on their chat playground, https://chat.inceptionlabs.ai/, and they have a helper function `const convertOpenAIMessages = (convo) => { ... }`, which also contains `models: ['gpt-3.5-turbo']`. I also see `"openai": true` in the API response. Is it actually calling OpenAI under the hood, or is it calling their dLLM? Does anyone know?
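For reference, here's a rough sketch of what a converter like that usually does. This is my guess at the shape, not their actual code, and the field names (`convo.messages`, `sender`, `text`) are assumptions: it just reshapes the conversation into the OpenAI chat-completions message format, which plenty of non-OpenAI backends also accept, so the helper by itself doesn't prove an OpenAI call.

```ts
// Hypothetical sketch of convertOpenAIMessages: reshape the playground's
// conversation object into an OpenAI-style chat-completions request.
// The input field names here are assumptions, not Inception's actual code.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

const convertOpenAIMessages = (convo: { messages: { sender: string; text: string }[] }) => ({
  // A 'gpt-3.5-turbo' placeholder in an OpenAI-compatible request doesn't
  // necessarily mean OpenAI is the backend actually serving it.
  models: ['gpt-3.5-turbo'],
  messages: convo.messages.map(
    (m): ChatMessage => ({
      role: m.sender === 'bot' ? 'assistant' : 'user',
      content: m.text,
    })
  ),
});
```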
Also: you can turn on "Diffusion Effect" in the top-right corner, but that just seems to be an animation gimmick, right?
replies(1):