
AI Roguelite News

Sapphire-tier (ChatGPT-3.5) subscription lowered to $15 per month

If you were previously subscribed at the $20 price, you may need to unsubscribe and wait until the end of your current billing period before re-subscribing at the new price.

Merchant restocking and clearer Thing bonus indication

Merchants now restock items after several turns.

Player-built Things positively influence their own area as well as immediately adjacent areas. These bonuses are now displayed in an indicator.
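
The changelog doesn't spell out how these bonuses combine, so the following is a purely hypothetical sketch (the coordinates, bonus values, and function names are invented) of how a per-area bonus plus adjacent-area spillover might be computed:

    # Hypothetical illustration of area + adjacent-area bonuses; none of these
    # names or numbers come from the game's actual code.
    from typing import Dict, Tuple

    Coord = Tuple[int, int]

    def total_bonus(area: Coord, built_things: Dict[Coord, int], bonus_per_thing: float = 1.0) -> float:
        # Sum bonuses from Things built in this area and in the four adjacent areas.
        x, y = area
        neighborhood = [(x, y), (x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        return sum(built_things.get(c, 0) * bonus_per_thing for c in neighborhood)

    # Example: two Things built here plus one in the area to the east.
    print(total_bonus((0, 0), {(0, 0): 2, (1, 0): 1}))  # -> 3.0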

Traps

Traps:
  • Set any Thing as a trap and arm it by providing the required ingredient. Each armed trap independently has a 50% chance of triggering when an enemy attacks you while you are in its area (see the sketch below).
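
As a rough illustration of those independent 50% rolls (this is not the game's code; the Trap class and the example trap names are invented):

    import random
    from dataclasses import dataclass

    @dataclass
    class Trap:
        name: str
        armed: bool = True
        trigger_chance: float = 0.5  # 50% per the changelog entry above

    def traps_triggered_on_attack(traps):
        # Each armed trap rolls independently when an enemy attacks you in the area.
        return [t for t in traps if t.armed and random.random() < t.trigger_chance]

    traps = [Trap("Spiked Chair"), Trap("Cursed Lantern", armed=False)]
    print([t.name for t in traps_triggered_on_attack(traps)])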


Misc:
  • Minimum level requirement for the main quest: this incentivizes the player to explore further. Can be turned off under Options.
  • Fixed an issue where only str and dex were generated as item attributes.
  • Customizable negative prompt for local Stable Diffusion and NovelAI image gen.
  • Speculative fix for local Stable Diffusion periodically not working, by simplifying the setup and using an SD-dedicated Python exe instead of a wrapper exe.

Blocked Places

Blocked places: Each place has a chance of spawning a blocked/locked place, which can be unblocked via items or abilities for XP. Blocked places also include a level jump, which serves as a way to reach a higher-level area more quickly.
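
The following is a hypothetical sketch of that flow; the spawn chance, level jump size, and XP reward are invented for illustration, not taken from the game:

    import random
    from dataclasses import dataclass

    @dataclass
    class Player:
        xp: int = 0

    @dataclass
    class BlockedPlace:
        level: int
        blocked: bool = True

    def maybe_spawn_blocked_place(current_level, spawn_chance=0.2):
        # Each place has a chance of spawning a blocked/locked place at a higher level.
        if random.random() < spawn_chance:
            return BlockedPlace(level=current_level + 3)  # level jump (value invented)
        return None

    def unblock(place, player):
        # Unblocking via a suitable item or ability grants XP and opens the shortcut.
        if place.blocked:
            place.blocked = False
            player.xp += 50  # XP reward (value invented)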

Misc:
  • Fixed Stable Diffusion not working.
  • Tentative fix and extra debug statements for an issue with local A1111 Stable Diffusion affecting only a subset of users.

A1111 for local image gen, better local text gen support

This update is mainly geared towards people who generate images or text locally using their NVIDIA GPU.

Overhauled the local image gen to use A1111 (the AUTOMATIC1111 Stable Diffusion web UI) and its API. It generates images much faster than the original Stable Diffusion integration, and with less VRAM required!
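
For anyone curious what talking to the A1111 API looks like, here is a minimal sketch of a direct txt2img call. This is not the game's own integration code; the localhost URL, parameters, and prompt strings are assumptions about a typical local setup with the web UI's API enabled:

    import base64
    import requests

    payload = {
        "prompt": "a dimly lit dungeon room, fantasy illustration",
        "negative_prompt": "blurry, low quality",  # ties in with the customizable negative prompt above
        "steps": 20,
        "width": 512,
        "height": 512,
    }

    resp = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload, timeout=120)
    resp.raise_for_status()

    # The API returns base64-encoded PNG data in the "images" list.
    with open("output.png", "wb") as f:
        f.write(base64.b64decode(resp.json()["images"][0]))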

The Oobabooga API was not suitable for my text gen purposes because of the overhead in traditional event checks, so instead of going that route, I fixed some code in gpt_server to autoload models better; more models should be supported than before.

Image gen and text gen are also a little more transparent than before: the Python code is exposed under the "ai" directory and is run via an included standalone Python exe. This may also make it slightly easier to debug issues.

I also implemented preliminary support for Llama-based chat models, but this has only been briefly tested on my own computer, and only with Aitrepreneur/wizardLM-7B-GPTQ-4bit-128g, which is small enough to fit on my GPU.
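
For reference, a GPTQ-quantized Llama model like that one can be loaded with the AutoGPTQ library and Hugging Face transformers. This is a generic sketch rather than the game's gpt_server code, and the loader options (such as use_safetensors) are assumptions about how the weights are packaged:

    from transformers import AutoTokenizer
    from auto_gptq import AutoGPTQForCausalLM

    model_id = "Aitrepreneur/wizardLM-7B-GPTQ-4bit-128g"
    tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
    # use_safetensors is an assumption; adjust to match the files in the repo.
    model = AutoGPTQForCausalLM.from_quantized(model_id, device="cuda:0", use_safetensors=True)

    prompt = "You are the narrator of a roguelike. Describe a mysterious locked door."
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))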