Axolotl

Accelerate AI Training with Axolotl on Novita AI

Framework
axolotl ai cloud · 2025-03-20 · Github

One click deployment

What is Axolotl?

Axolotl is an AI model fine-tuning tool designed to streamline the fine-tuning process of various AI models, supporting multiple configurations and architectures.

Key Features and Benefits of Axolotl

Key Features

  • Broad Model Support: Train a wide range of Hugging Face models, including Llama, Pythia, Falcon, and MPT.

  • Fine-Tuning Methods: Support for diverse fine-tuning techniques such as full fine-tuning, LoRA, QLoRA, ReLoRA, and GPTQ.

  • Customizable Configurations: Easily customize configurations using simple YAML files or CLI options.

  • Flexible Dataset Handling: Load datasets in a variety of formats, use custom formats, or bring your own pre-tokenized datasets.

  • Enhanced Performance Integration: Integrate with xFormers, Flash Attention, RoPE scaling, and multipacking for improved performance.
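As a rough illustration of the YAML-driven workflow, a LoRA fine-tune is described in a single config file. The field names below follow Axolotl's configuration conventions, but the model id, dataset, and hyperparameter values are illustrative examples, not a tuned recipe:

```yaml
# Illustrative Axolotl config sketch; values are examples, not a tested recipe.
base_model: NousResearch/Llama-3.2-1B   # any supported Hugging Face model id

adapter: lora          # or qlora; omit the adapter key for a full fine-tune
lora_r: 16
lora_alpha: 32
lora_dropout: 0.05
lora_target_linear: true

datasets:
  - path: teknium/GPT4-LLM-Cleaned     # example dataset in alpaca format
    type: alpaca

sequence_len: 2048
micro_batch_size: 2
gradient_accumulation_steps: 4
num_epochs: 1
learning_rate: 0.0002
output_dir: ./outputs/lora-out
```

Switching between methods is mostly a matter of changing the adapter key and related fields, which is why the same tool covers full fine-tuning, LoRA, and QLoRA workflows.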

Benefits

  • Scalability: Axolotl allows for easy scaling of resources to accommodate varying workloads, making it suitable for both small and large-scale applications.

  • Cost Efficiency: By leveraging GPU resources in the cloud, users can optimize costs by only paying for what they use, avoiding the need for expensive on-premises hardware.

  • Performance: Axolotl is designed to maximize the performance of GPU resources, enabling faster processing times for data-intensive tasks such as machine learning and simulations.

  • Flexibility: Users can easily configure and customize their environments to meet specific needs, allowing for a wide range of applications from research to production.

Who Will Use Axolotl?

  • AI Researchers and Data Scientists: Individuals who are focused on developing, fine-tuning, and experimenting with large-scale machine learning models. Axolotl provides them with the flexibility and tools needed to efficiently train and optimize these models.

  • Machine Learning Engineers: Professionals who need a scalable and efficient solution to train and deploy machine learning models. Axolotl's support for distributed training and acceleration makes it ideal for engineers looking to streamline their workflow.

  • Startups and Tech Companies: Organizations that are building AI-driven products and services. Axolotl helps these companies accelerate their AI development process, reduce costs, and improve the performance of their AI models by leveraging advanced training techniques.

  • Open-Source Contributors: Developers and AI enthusiasts who are interested in contributing to an open-source AI training framework. Axolotl's open-source nature encourages collaboration and innovation within the community.

  • Academic Institutions: Universities and research institutions that require a robust and adaptable platform for teaching and research in AI and machine learning. Axolotl's flexibility makes it a valuable resource for educational purposes.

How to Use Axolotl:main-latest on Novita AI

Step 1: Access the GPU Instance Console

  • Go to the GPU menu.

  • Click Get Started to access the GPU Instance console.


Step 2: Choose a Template and GPU Type

  • Browse various official templates and GPU card options.

  • Select the Axolotl:main-latest template.

  • Click Deploy under the 4090 GPU card to proceed to the instance creation page.


Step 3: Adjust Disk and Configuration Parameters

  • In the Disk section, adjust the size of the system disk and local disk.

  • In the Configuration section, modify settings such as the image, startup commands, ports, and environment variables.

  • Check the box for Start Jupyter Notebook to launch Jupyter.


Step 4: Confirm Configuration and Deploy

  • Review the instance configuration and costs on the confirmation page.

  • Click Deploy to start the deployment process.

Step 5: Wait for Deployment to Complete

  • Wait for the instance to finish deploying.

Step 6: Manage and Monitor Instances

  • Once deployment is complete, the system will redirect you to the Instance Management page.

  • Locate your newly created instance, which will initially show a Pulling status (indicating the image is being downloaded).

  • Click the small arrow on the right side of the instance to view details.

  • Monitor the image pull progress. Once complete, the instance will transition to Running status.

  • Click Logs to view deployment logs.

Step 7: Check Instance Logs

  • Go to the Instance Logs tab to check if the service is starting.

  • Wait for the service to finish initializing.

Step 8: Connect to Jupyter Lab

  • Close the logs page.

  • Click Connect to open the connection information page.

  • Locate the Connection Options section and click Connect to Jupyter Lab to access the Jupyter interface.


Step 9: Access Jupyter Lab

  • Wait for the Jupyter Lab web interface to load.

  • Open Terminal to run an official example and verify the service is working correctly.

Step 10: Run a Fine-Tuning Example

  • Execute the official example code to perform a fine-tuning task.
```shell
# Fetch axolotl examples
axolotl fetch examples

# Or, specify a custom path
axolotl fetch examples --dest path/to/folder

# Train a model using LoRA
axolotl train examples/llama-3/lora-1b.yml
```

Further Ethical Considerations and Solutions

When implementing Axolotl:main-latest, address these ethical guidelines:

Combat Bias Proactively: Audit outputs regularly, use diverse training data, implement feedback mechanisms.

Prioritize Data Protection: Use stringent handling protocols, anonymization, and privacy-preserving methods.

Prevent Harmful Content: Establish moderation systems, develop usage policies, educate users on limitations.

Ensure Transparency: Document model functions and decision-making processes clearly.

Implement Ethical Governance: Incorporate ethical principles from initial deployment stages.

Balance Innovation and Responsibility: Maintain equilibrium between AI advancement and ethical obligations.

FAQs

What types of AI model fine-tuning methods does Axolotl support?

Axolotl supports various fine-tuning methods, including full fine-tuning, LoRA, QLoRA, ReLoRA, and GPTQ.

How can users customize configurations when fine-tuning with Axolotl?

Users can customize configurations through simple YAML files or command-line interface (CLI) options.

Does Axolotl support running in different hardware environments?

Yes, Axolotl supports running on a single GPU or multiple GPUs, and it can be run locally or in the cloud via Docker.

What dataset formats does Axolotl support?

Axolotl supports various dataset formats including JSONL. It also supports using custom formats or bringing your own tokenized datasets.
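For instance, an instruction-tuning dataset in the widely used alpaca layout is plain JSON Lines, one JSON object per line. The field names below (instruction, input, output) follow the alpaca convention, and the file path is illustrative:

```python
import json
from pathlib import Path

# Write a tiny alpaca-style JSONL dataset: one JSON object per line.
records = [
    {"instruction": "Translate to French.", "input": "Hello", "output": "Bonjour"},
    {"instruction": "Add the numbers.", "input": "2 + 3", "output": "5"},
]
path = Path("train.jsonl")
path.write_text("\n".join(json.dumps(r) for r in records) + "\n")

# Read it back the way a dataset loader would: parse line by line.
loaded = [json.loads(line) for line in path.read_text().splitlines()]
print(len(loaded), loaded[0]["output"])  # 2 Bonjour
```

A dataset entry in the training YAML then just points at the file and names the format (for example, type: alpaca).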

What advanced features or integrations does Axolotl offer?

Axolotl integrates with xFormers, Flash Attention, RoPE scaling, and multipacking, and it supports multi-GPU training through FSDP or DeepSpeed.
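For multi-GPU runs, the training config can reference a DeepSpeed JSON file. This is a hedged sketch using Axolotl's config key; the path shown is an example of the ZeRO configs distributed with the project, not a guaranteed location:

```yaml
# Enable DeepSpeed for multi-GPU training by pointing at a ZeRO config
# (example path; FSDP is an alternative configured with its own keys).
deepspeed: deepspeed_configs/zero2.json
```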

How to perform data preprocessing in Axolotl?

Data preprocessing can be done with the command python -m axolotl.cli.preprocess together with a YAML configuration file.

Does Axolotl support recording the training process and results?

Yes, Axolotl supports logging results and optional checkpoints to Weights & Biases (wandb) or MLflow.

License

Apache License 2.0

View on Github

Source Site: https://github.com/axolotl-ai-cloud/axolotl

Get in Touch

About Novita AI

Novita AI is an AI cloud platform that offers developers an easy way to deploy AI models using our simple API, while also providing an affordable and reliable GPU cloud for building and scaling.

Other Recommended Templates

MiniCPM-V-2_6

Empower Your Applications with MiniCPM-V 2.6 on Novita AI.

Kohya-SS

Unleash the Power of Kohya-SS with Novita AI

stable-diffusion-3-medium

Transform Creativity with Stable Diffusion 3 Medium on Novita AI

Qwen2-Audio-7B-Instruct

Empower Your Audio with Qwen2 on Novita AI

Llama3.1-8B

Run Llama3.1-8B with SGlang on Novita AI

Ready to build smarter? Start today.
Get started with Novita AI and unlock the power of affordable, reliable, and scalable AI inference for your applications.

Building an AI startup? Get up to $10,000 in credits!

Get up to $10,000 in credits and dedicated support to grow and scale your AI startup.

Apply Now