
New Models and Datasets

I’ve been very busy the past two weeks trying to put a few things together. I’m still doing a lot of experimentation, so none of this is really groundbreaking. Here’s what I have so far.

Cockatrice-v0.3 is trained on pure roleplay chat data. I’ve further refined the dataset and fixed the tokenization of the delimiter tokens. Thanks to the community for pointing out my mistake; I’m still learning how everything fits together.

Gorgon-v0.1 is a combination of freedom-rp and erotica-analysis-16k. I wanted a summarization model that I could use in mergekit Mistral merges, and I decided to include the multiturn-rp chat in hopes of improving its ability to do follow-up rewrite responses. Personally I’d probably skip over this one as a daily driver; it’s more of a utility/experimental model.
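For anyone unfamiliar with mergekit, a merge is driven by a small YAML config. Here’s a minimal sketch of a slerp merge between two Mistral models — the model paths and the `t` value are placeholders, not the actual merge recipe I’m using. Since JSON is a subset of YAML, you can even generate the config from Python with nothing but the standard library:

```python
import json

# Hypothetical mergekit slerp config. Model paths and parameters are
# placeholders, not the actual models/settings used for my merges.
# JSON is valid YAML, so mergekit's YAML loader accepts this file as-is.
config = {
    "merge_method": "slerp",
    "base_model": "path/to/mistral-base",
    "slices": [{
        "sources": [
            {"model": "path/to/mistral-base", "layer_range": [0, 32]},
            {"model": "path/to/summarizer-model", "layer_range": [0, 32]},
        ],
    }],
    "parameters": {"t": 0.5},  # interpolation weight between the two models
    "dtype": "bfloat16",
}

with open("merge-config.yml", "w") as f:
    json.dump(config, f, indent=2)
```

You’d then run something like `mergekit-yaml merge-config.yml ./merged-model` to produce the merged weights.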

I’ve also created a new erotiquant dataset, erotiquant2. It should be even cleaner and more thoroughly pruned than the previous erotiquant. It’s based on the lamia dataset I’ve also recently released.

In Greek mythology, a lamia is a hermaphroditic serpentine demon that eats children and seduces men. In dataset terms, it includes pruned roleplay chats from freedom-rp (keeping mostly long 16k-context chats, with most of the jailbreaks manually removed along with the smaller-context utility/SEO data), a subset of erotica-analysis-16k, and a subset of Airoboros data. The Airoboros data includes the following categories: “roleplay”, “unalignment”, “editor”, “writing”, “detailed_writing”, “stylized_response”, “unalign”, “cot”, and “song”. I’m hoping that models trained on it can serve both as good general roleplay chat models and as solid tools for helping me create long context training data.
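The length-based pruning step above can be sketched roughly like this. To be clear, the record layout and the chars-per-token heuristic are my illustrative assumptions here, not the exact script used to build the dataset:

```python
# Rough sketch of length-based pruning: keep only conversations long
# enough to plausibly fill a 16k-token context. The sharegpt-style
# record format and the 4-chars-per-token average are assumptions.
CHARS_PER_TOKEN = 4      # crude average for English prose
MIN_TOKENS = 16_000

def is_long_context(record):
    total_chars = sum(len(turn["value"]) for turn in record["conversations"])
    return total_chars / CHARS_PER_TOKEN >= MIN_TOKENS

def prune(dataset):
    return [record for record in dataset if is_long_context(record)]

short = {"conversations": [{"from": "human", "value": "hi"}]}
long_ = {"conversations": [{"from": "gpt", "value": "x" * 80_000}]}
print(len(prune([short, long_])))  # 1 — only the long chat survives
```

A real pipeline would count actual tokens with the model’s tokenizer, but the character heuristic is cheap and good enough for a first pass.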

Keep in mind that the lamia-mistral models I release are based on the Yarn-Mistral-7b-128k model, not vanilla Mistral. In this version, the sliding attention window is effectively turned off (set to the max context size), which greatly improves the quality of long context summarization/analysis. If your machine is starved for VRAM, I believe it is possible to turn the sliding attention window back on, but I haven’t experimented with that yet.

The newest version of the Lamia dataset also includes pippa_scored2sharegpt, a modified version of pippa_scored in which I’ve reformatted the data as sharegpt training data. lamia-v0.1 was not trained on this edition of the dataset, but the next version is currently training on the full new dataset, which includes it.
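For reference, the sharegpt layout is just a list of turns tagged with who spoke. A toy converter illustrates the target shape — the input schema (a list of speaker/text pairs) is my assumption for illustration, not pippa_scored’s exact format:

```python
# Toy conversion to sharegpt format. The input schema (a list of
# (speaker, text) tuples) is an assumed stand-in, not the exact
# layout of pippa_scored.
ROLE_MAP = {"user": "human", "bot": "gpt"}

def to_sharegpt(turns):
    return {"conversations": [
        {"from": ROLE_MAP[speaker], "value": text}
        for speaker, text in turns
    ]}

example = [("user", "Hello!"), ("bot", "Hi there.")]
print(to_sharegpt(example))
# {'conversations': [{'from': 'human', 'value': 'Hello!'},
#                    {'from': 'gpt', 'value': 'Hi there.'}]}
```

Most training frameworks that accept sharegpt data expect exactly this `conversations` list of `from`/`value` objects.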

If it’s not obvious, one of my primary goals is to create a model that lets me break free from using the OpenAI API to create NSFW training data. Manually processing the censored outputs and dealing with the associated costs is just not sustainable.

Lastly, I’ve greatly improved the selection of goods available on my Etsy store, and I’ve also stocked several fragrances that can be added to any of the available products. Valentine’s Day is around the corner, so whether you want something cute and classy, or a monstrously large shlong of a gag gift, I’ve got you covered.


Join my Discord server if you have any constructive criticism or suggestions for how I should be moving forward.

