Axolotl

Accelerate AI Training with Axolotl on Novita AI

Framework
axolotl ai cloud · 2025-09-29 · GitHub

README

## What is Axolotl?

Axolotl is an AI model fine-tuning tool designed to streamline the fine-tuning of a wide range of AI models, supporting multiple configurations and architectures.

## Key Features and Benefits of Axolotl

### Key Features

- Broad model support: train a variety of Hugging Face models, including Llama, Pythia, Falcon, and MPT.
- Fine-tuning methods: use diverse techniques such as full fine-tuning, LoRA, QLoRA, ReLoRA, and GPTQ.
- Customizable configurations: configure runs with simple YAML files or CLI options.
- Flexible dataset handling: load different dataset formats, define custom formats, or bring your own tokenized datasets.
- Performance integrations: combine xFormers, Flash Attention, RoPE scaling, and multipacking for faster training.

### Benefits

- Scalability: Axolotl allows easy scaling of resources to accommodate varying workloads, making it suitable for both small and large-scale applications.
- Cost efficiency: by leveraging GPU resources in the cloud, users pay only for what they use, avoiding the need for expensive on-premises hardware.
- Performance: Axolotl is designed to maximize the performance of GPU resources, enabling faster processing times for data-intensive tasks such as machine learning and simulations.
- Flexibility: users can easily configure and customize their environments to meet specific needs, supporting a wide range of applications from research to production.

## Who Will Use Axolotl?

- AI Researchers and Data Scientists: individuals focused on developing, fine-tuning, and experimenting with large-scale machine learning models.
Axolotl provides them with the flexibility and tools needed to efficiently train and optimize these models.
- Machine Learning Engineers: professionals who need a scalable, efficient solution to train and deploy machine learning models. Axolotl's support for distributed training and acceleration makes it ideal for engineers looking to streamline their workflow.
- Startups and Tech Companies: organizations building AI-driven products and services. Axolotl helps these companies accelerate their AI development process, reduce costs, and improve model performance through advanced training techniques.
- Open-Source Contributors: developers and AI enthusiasts interested in contributing to an open-source AI training framework. Axolotl's open-source nature encourages collaboration and innovation within the community.
- Academic Institutions: universities and research institutions that require a robust, adaptable platform for teaching and research in AI and machine learning.
Axolotl's flexibility makes it a valuable resource for educational purposes.

## How to Use Axolotl:main-latest on Novita AI

### Step 1: Access the GPU Instance Console

- Go to the GPU menu.
- Click Get Started to access the GPU Instance console.

### Step 2: Choose a Template and GPU Type

- Browse the official templates and GPU card options.
- Select the Axolotl:main-latest template.
- Click Deploy under the 4090 GPU card to proceed to the instance creation page.

### Step 3: Adjust Disk and Configuration Parameters

- In the Disk section, adjust the size of the system disk and local disk.
- In the Configuration section, modify settings such as the image, startup commands, ports, and environment variables.
- Check the Start Jupyter Notebook box to launch Jupyter.

### Step 4: Confirm Configuration and Deploy

- Review the instance configuration and costs on the confirmation page.
- Click Deploy to start the deployment process.

### Step 5: Wait for Deployment to Complete

- Wait for the instance to finish deploying.

### Step 6: Manage and Monitor Instances

- Once deployment is complete, the system redirects you to the Instance Management page.
- Locate your newly created instance, which will initially show a Pulling status (indicating the image is being downloaded).
- Click the small arrow on the right side of the instance to view details.
- Monitor the image pull progress.
Once complete, the instance will transition to Running status.
- Click Logs to view the deployment logs.

### Step 7: Check Instance Logs

- Go to the Instance Logs tab to check whether the service is starting.
- Wait for the service to finish initializing.

### Step 8: Connect to Jupyter Lab

- Close the logs page.
- Click Connect to open the connection information page.
- Locate the Connection Options section and click Connect to Jupyter Lab to access the Jupyter interface.

### Step 9: Access Jupyter Lab

- Wait for the Jupyter Lab web interface to load.
- Open a terminal to run an official example and verify the service is working correctly.

### Step 10: Run a Fine-Tuning Example

- Execute the official example code to perform a fine-tuning task:

```bash
# Fetch the axolotl examples
axolotl fetch examples

# Or, specify a custom destination
axolotl fetch examples --dest path/to/folder

# Train a model using LoRA
axolotl train examples/llama-3/lora-1b.yml
```

## Further Ethical Considerations and Solutions

When implementing Axolotl:main-latest, address these ethical guidelines:

- Combat Bias Proactively: audit outputs regularly, use diverse training data, and implement feedback mechanisms.
- Prioritize Data Protection: use stringent handling protocols, anonymization, and privacy-preserving methods.
- Prevent Harmful Content: establish moderation systems, develop usage policies, and educate users on limitations.
- Ensure Transparency: document model functions and decision-making processes clearly.
- Implement Ethical Governance: incorporate ethical principles from the initial deployment stages.
- Balance Innovation and Responsibility: maintain equilibrium between AI advancement and ethical obligations.

## FAQs

### What types of AI model fine-tuning methods does Axolotl support?

Axolotl supports various
fine-tuning methods, including full fine-tuning, LoRA, QLoRA, ReLoRA, and GPTQ.

### How can users customize configurations when fine-tuning with Axolotl?

Users can customize configurations through simple YAML files or the command-line interface (CLI).

### Does Axolotl support running in different hardware environments?

Yes. Axolotl supports running on single or multiple GPUs and can easily be run locally or in the cloud through Docker.

### What dataset formats does Axolotl support?

Axolotl supports various dataset formats, including JSONL. It also supports custom formats, or you can bring your own tokenized datasets.

### What advanced features or integrations does Axolotl offer?

Axolotl integrates with xFormers, Flash Attention, RoPE scaling, and multipacking, and supports multi-GPU training through FSDP or DeepSpeed.

### How do I perform data preprocessing in Axolotl?

Data preprocessing can be done with the command `python -m axolotl.cli.preprocess` along with a YAML configuration file.

### Does Axolotl support recording the training process and results?

Yes. Axolotl supports logging results and optional checkpoints to wandb or mlflow.

## License

Apache License 2.0

## View on GitHub

Source site: https://github.com/axolotl-ai-cloud/axolotl

### Get in Touch

- Email: iris@novita.ai
- Discord: novita.ai

### About Novita AI

Novita AI is an AI cloud platform that offers developers an easy way to deploy AI models using our simple API, while also providing an affordable and reliable GPU cloud for building and scaling.
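The fine-tuning walkthrough and FAQs above both revolve around Axolotl's YAML configuration files. As a rough illustration of what such a file looks like, here is a minimal LoRA config sketch; the field names follow Axolotl's config schema, but the model name, dataset, and hyperparameter values are placeholder assumptions to adapt for your own run, not the contents of the official `lora-1b.yml` example:

```yaml
# Illustrative Axolotl LoRA fine-tuning config (values are placeholders)
base_model: meta-llama/Llama-3.2-1B   # any supported Hugging Face model

load_in_8bit: true
adapter: lora
lora_r: 16
lora_alpha: 32
lora_dropout: 0.05
lora_target_linear: true              # apply LoRA to all linear layers

datasets:
  - path: teknium/GPT4-LLM-Cleaned    # any dataset in a supported format
    type: alpaca

sequence_len: 2048
micro_batch_size: 2
gradient_accumulation_steps: 4
num_epochs: 1
learning_rate: 0.0002
optimizer: adamw_torch
lr_scheduler: cosine

output_dir: ./outputs/lora-1b
# Optional: log metrics and checkpoints to Weights & Biases
# wandb_project: axolotl-demo
```

A config like this would be run with `axolotl train path/to/config.yml`, optionally after preprocessing the dataset first with `python -m axolotl.cli.preprocess path/to/config.yml` as noted in the FAQ.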

Other Recommended Templates

koboldcpp

Run GGUF models easily with a KoboldAI UI. One File. Zero Install.

FLUX.1-dev

Unleash Your Creativity with the FLUX.1 [dev] Text-to-Image Model

Gemma-2-2b-it

Optimized AI Performance with Gemma-2-2b-it on Novita AI

Facefusion v3.1.1

Seamlessly merge and enhance faces with Facefusion v3.1.1

Ready to build smarter? Start today.
Get started with Novita AI and unlock the power of affordable, reliable, and scalable AI inference for your applications.