
Mastering Cargo: Features, Profiles, and More

Cargo is more than just a package manager—it’s the backbone of the Rust development workflow. From compiling code to managing dependencies, building workspaces, configuring profiles, and enabling feature flags, Cargo quietly empowers virtually every Rust project. This post dives deep into Cargo’s more advanced capabilities to help you truly master your build process.


Introduction

Rust’s cargo command-line tool does a lot of heavy lifting behind the scenes. Most Rust developers are familiar with basic commands like cargo build, cargo run, and cargo test. But Cargo offers a rich set of features that go far beyond the basics—including workspace management, custom build profiles, conditional compilation with features, and build script integration.

In this post, we’ll demystify these powerful Cargo capabilities. You’ll learn how to organize large codebases, build for different environments, optimize your compilation process, and configure your project for flexible deployment scenarios. Whether you’re working solo or across a team, mastering Cargo will improve your productivity and project maintainability.


Understanding Cargo Workspaces

As Rust projects evolve from quick scripts to multi-component systems, effective organization becomes essential. You might start with a single crate, but before long, you’ll want to extract reusable logic into a library, isolate test tools into a separate package, or structure your code for multiple binaries. That’s where Cargo workspaces come in.

A workspace is Cargo’s way of grouping related packages (or “crates”) under one umbrella. It offers a unified structure for managing dependencies, builds, and versioning across multiple crates—without duplicating configuration or compiling the same dependency multiple times. Workspaces are especially useful for monorepos, modular libraries, or projects that include both a binary and a supporting library.

By sharing a single Cargo.lock and target directory, workspaces save time, disk space, and mental overhead. They also make it easier to keep your codebase consistent and your build process fast—critical benefits when collaborating across teams or scaling your Rust code to production systems.

In this section, we’ll explore how workspaces are defined, how to set one up from scratch, and how to make the most of them in real-world scenarios.


What is a Workspace?

In Cargo, a workspace is a configuration that ties multiple crates together under a single root. Instead of being just a collection of folders, a workspace introduces shared build context—like a unified Cargo.lock, a single target/ directory, and coordinated commands like cargo build, cargo test, and cargo run.

Crates within a workspace retain their independence but benefit from centralized coordination. This means you can break your code into logical units—like libraries, binaries, and tools—without duplicating build logic or dependency declarations.

Let’s look at the anatomy of a workspace and how it sets the foundation for scalable Rust development.


Example: Basic Workspace Structure

Here’s what a minimal workspace might look like:

ferris_workspace/
├── Cargo.toml         # Workspace root (not a crate)
├── Cargo.lock
├── members/
│   ├── ferris_core/
│   │   └── Cargo.toml
│   └── ferris_app/
│       └── Cargo.toml

We typically create the top-level workspace directory manually, as Cargo does not have a command to create a workspace.

We also need to manually create the workspace Cargo.toml file, used to configure the workspace.


Workspace Root Cargo.toml

# ferris_workspace/Cargo.toml
[workspace]
members = [
    "members/ferris_core",
    "members/ferris_app"
]

This top-level Cargo.toml doesn’t define its own [package] section—because it isn’t a crate itself. It simply defines the workspace and its member crates.

In this workspace we have two member crates: ferris_core, which will be a library crate, and ferris_app, which will be a binary crate.


Library Crate: ferris_core

# ferris_workspace/members/ferris_core/Cargo.toml
[package]
name = "ferris_core"
version = "0.1.0"
edition = "2021"

[lib]
path = "src/lib.rs"
// ferris_workspace/members/ferris_core/lib.rs
pub fn greet(name: &str) -> String {
    format!("Hello, {name} from ferris_core!")
}

Our library crate is created using a Cargo command, which we’ll see in the next sub-section. When we create this library crate, a directory is automatically created for us, along with a crate Cargo.toml configuration file and a src/lib.rs file in which we add the code for our library.


Binary Crate: ferris_app

# ferris_workspace/members/ferris_app/Cargo.toml
[package]
name = "ferris_app"
version = "0.1.0"
edition = "2021"

[dependencies]
ferris_core = { path = "../ferris_core" }

// ferris_workspace/members/ferris_app/src/main.rs
use ferris_core::greet;

fn main() {
    let message = greet("Greg");
    println!("{}", message);
}

/*
Output:
Hello, Greg from ferris_core!
*/

Our binary crate is created using a Cargo command, which we’ll see in the next sub-section. When we create this binary crate, a directory is automatically created for us, along with a crate Cargo.toml configuration file and a src/main.rs file in which we add the code for our application.


Running the Workspace

From the root directory (ferris_workspace), you can run:

cargo build

This builds all workspace members, but does not run the program.

You can also build or run an individual crate. The run command builds and runs the program:

cargo run -p ferris_app

Or run all tests across the workspace:

cargo test

Summary

Cargo workspaces offer a clean way to manage multiple related crates in a single project. They allow:

  • Shared lockfiles and target directories (faster builds)
  • Easy project-wide commands (build, test, etc.)
  • Seamless cross-crate development via path dependencies

This setup is especially valuable when you’re working with modular code, a combination of libraries and binaries, or building a monorepo-style architecture.


Creating and Managing a Workspace

Once you understand what a Cargo workspace is and why it’s useful, the next step is learning how to create and manage one from scratch. Whether you’re starting a new project or refactoring an existing one, setting up a workspace is a straightforward process that unlocks powerful organization and build features.

In this section, we’ll walk through the setup of a workspace directory, creation of member crates, and how to structure and manage your workspace over time. By the end, you’ll have a solid foundation for building and scaling multi-crate Rust projects.


Step-by-Step: Creating a Workspace from Scratch

Let’s build a workspace named ferris_workspace with:

  • A core library crate called ferris_core
  • A binary application crate called ferris_app that depends on ferris_core

Step 1: Create the root directory manually

mkdir ferris_workspace
cd ferris_workspace

Step 2: Create a workspace-level Cargo.toml file manually

This file doesn’t define a crate—it defines the workspace itself.

[workspace]
members = []

Initially our members array will be empty.


Add a Library Crate to the Workspace

Execute the following command in the top-level workspace directory ferris_workspace:

cargo new ferris_core --lib

This command automatically creates a directory ferris_core/ with its own Cargo.toml:

[package]
name = "ferris_core"
version = "0.1.0"
edition = "2024"

[dependencies]

The command also automatically creates a lib.rs file for our library code. Replace the default content in lib.rs with the following:

pub fn get_status() -> &'static str {
    "All systems operational."
}

The workspace Cargo.toml file should automatically have the library crate added to its members array. If it doesn’t, add it yourself so that your ferris_workspace/Cargo.toml file has the following contents:

[workspace]
members = [
  "ferris_core"
]

Add a Binary Crate that Uses the Library

Execute the following command in the top-level workspace directory ferris_workspace:

cargo new ferris_app

This command automatically creates a directory ferris_app/ with its own Cargo.toml:

[package]
name = "ferris_app"
version = "0.1.0"
edition = "2024"

[dependencies]

Add a dependency for our ferris_core library crate in this ferris_app/Cargo.toml file:

[dependencies]
ferris_core = { path = "../ferris_core" }

The command also automatically creates a main.rs file for our application code. Replace the default content in main.rs with the following:

use ferris_core::get_status;

fn main() {
    println!("App starting...");
    println!("{}", get_status());
}

/*
Output:
App starting...
All systems operational.
*/

Notice that we are calling the get_status() function from our library crate in our binary crate main() function.

The workspace Cargo.toml file should automatically have the binary crate added to its members array. If it doesn’t, add it yourself so that your ferris_workspace/Cargo.toml file has the following contents:

[workspace]
members = [ 
  "ferris_app",
  "ferris_core"
]

Run the application by executing this in the workspace directory ferris_workspace:

cargo run

Expected output:

App starting...
All systems operational.

Managing the Workspace

Once set up, you can manage the workspace from the root directory:

  • Build all crates: cargo build
  • Test all crates: cargo test
  • Run a specific binary: cargo run -p ferris_app
  • Add a new member:
    • Create a new crate (e.g., cargo new ferris_utils)
    • Add it to the members array in the workspace root Cargo.toml

Summary

Creating and managing a workspace is mostly about setting up the right structure:

  • Define members in the root Cargo.toml
  • Keep each crate self-contained
  • Use local path dependencies for inter-crate communication

With this setup in place, you’re ready to scale your Rust projects efficiently. This post will also look at how you can fine-tune each crate’s capabilities using Cargo features for flexible configuration and conditional compilation.


Use Cases and Best Practices

Now that you’ve created a workspace and added crates to it, let’s talk about when using a workspace really shines—and how to structure one effectively. Workspaces are more than just a way to bundle code; they’re an essential tool for clean architecture, modular development, and scaling teams and projects.

In this subsection, we’ll walk through common use cases where workspaces excel, and then share best practices to keep your project maintainable as it grows. We’ll build on the ferris_workspace example and expand it to illustrate each scenario.


Use Case 1: Separate Binaries and Libraries

It’s common to have a core library that implements functionality, and one or more binaries that use it. This avoids code duplication and keeps your business logic testable and reusable.

We already did this in the previous section with:

  • ferris_core: reusable library crate
  • ferris_app: binary crate that calls ferris_core::get_status()

You can even add a second binary for internal tools:

cargo new ferris_diag

Ensure your workspace Cargo.toml has this new crate as a member:

[workspace]
members = [ 
  "ferris_app",
  "ferris_core", 
  "ferris_diag"
]

Let’s edit the new binary ferris_diag/Cargo.toml file to also use our ferris_core library crate:

[package]
name = "ferris_diag"
version = "0.1.0"
edition = "2024"

[dependencies]
ferris_core = { path = "../ferris_core" }

Also replace the contents of this new binary crate main.rs file with the following:

use ferris_core::get_status;

fn main() {
    println!("Running diagnostics...");
    println!("{}", get_status());
}

/*
Output:
Running diagnostics...
All systems operational.
*/

This design pattern helps you scale by splitting CLI tools, web servers, and test apps into separate binaries, while keeping the core logic shared.


Run the new binary by executing this in the workspace directory ferris_workspace:

cargo run -p ferris_diag

Expected output:

Running diagnostics...
All systems operational.

Note that because we now have two binary crates in the workspace, we can’t just execute cargo run; we need to tell Cargo which one to run, either with -p and the package name (as below) or with --bin and the binary name.

So if we want to run our first binary crate we need to execute:

cargo run -p ferris_app

Use Case 2: Modular Libraries for Clean Separation

Workspaces are great for large libraries where concerns can be split into subcrates—think ferris_utils, ferris_math, ferris_network, etc.

Let’s add a hypothetical utility library crate:

cargo new ferris_utils --lib

Ensure your workspace Cargo.toml has this new crate as a member:

[workspace]
members = [ 
  "ferris_app",
  "ferris_core", 
  "ferris_diag", 
  "ferris_utils"
]

Let’s edit the ferris_core library crate Cargo.toml file to have a dependency for ferris_utils:

[package]
name = "ferris_core"
version = "0.1.0"
edition = "2024"

[dependencies]
ferris_utils = { path = "../ferris_utils" }

Now replace the contents of the ferris_utils/src/lib.rs file with the following:

pub fn uptime() -> u32 {
    42 // pretend uptime in seconds
}

Now we can edit our ferris_core/src/lib.rs to use the ferris_utils library:

use ferris_utils::uptime;

pub fn get_status() -> String {
    format!("Uptime: {} seconds", uptime())
}

Notice how we call the ferris_utils library’s uptime() function inside our ferris_core library’s get_status() function.

Run the ferris_app binary by executing this in the workspace directory ferris_workspace:

cargo run -p ferris_app

Expected output:

App starting...
Uptime: 42 seconds

This modular approach keeps your crates small, focused, and testable.


Best Practices for Workspace Design

  • Group by concern, not by size: Create subcrates for logical boundaries, not arbitrary lines. Good examples: core, cli, network, utils.
  • Avoid deep interdependencies: Don’t create dependency cycles between crates. Use clearly layered design: e.g., utils → core → app.
  • Keep root focused on coordination: Don’t turn your root into an active crate unless absolutely necessary.
  • Use cargo run -p crate_name and cargo test -p crate_name for targeted builds and tests when scaling up.

Summary

Cargo workspaces shine when you’re organizing code into clean, purposeful components. By keeping logic separated but connected, you enable parallel development, faster testing, and cleaner builds.

In the next section, we’ll explore other Cargo features.


Leveraging Cargo Features

Rust’s powerful type system and strict compilation model give developers great control over performance and correctness—but sometimes you need flexibility. That’s where Cargo features come in.

Features let you define optional pieces of functionality in your crate. Consumers (or other crates in a workspace) can opt into specific features to enable or disable functionality, reduce compile times, toggle dependencies, or gate experimental APIs. This is especially useful in libraries that serve multiple use cases, and in workspace setups where different binaries may require different capabilities from shared code.

In this section, we’ll explore how to define and use features in real-world projects, with examples based on our existing ferris_workspace.


What Are Features?

Cargo features are named configuration flags that can enable or disable parts of your code at compile time. They are typically used to:

  • Conditionally compile code
  • Control optional dependencies
  • Offer different capabilities for different use cases (e.g. “cli”, “serde”, “nightly”)

Features are declared in Cargo.toml and then accessed via conditional compilation in Rust source files using this attribute:

#[cfg(feature = "...")]

Let’s add a feature to the ferris_core library crate to conditionally include logging behavior.

Example: Defining and Using a Feature in ferris_core

We’ll define a feature called log_status that enables printing to the console whenever the status is queried.

Update the ferris_core/Cargo.toml file to have this content:

[package]
name = "ferris_core"
version = "0.1.0"
edition = "2021"

[dependencies]
ferris_utils = { path = "../ferris_utils" }

[features]
default = []
log_status = []

This declares a feature called log_status, with no default features enabled.

The default = [] line in the [features] section explicitly defines the default set of features that will be enabled when none are specified. In our case, we’re making it empty, which means that log_status must be enabled explicitly. If we wanted the logging to be on by default, we could write:

default = ["log_status"]

Let’s also update the ferris_core/src/lib.rs library file:

use ferris_utils::uptime;

pub fn get_status() -> String {
    let message = format!("Uptime: {} seconds", uptime());

    #[cfg(feature = "log_status")]
    println!("LOG: Status checked");

    message
}

If the log_status feature is enabled, the library will print a log message when get_status() is called. If the log_status feature is not enabled, that code won’t even be compiled.


Example: Enable the Feature from ferris_app

Now, in the ferris_app crate, you can opt into the log_status feature.

Update ferris_app/Cargo.toml:

[package]
name = "ferris_app"
version = "0.1.0"
edition = "2021"

[dependencies]
ferris_core = { path = "../ferris_core", features = ["log_status"] }

We don’t need to change file ferris_app/src/main.rs.

Run the ferris_app binary by executing this in the workspace directory ferris_workspace:

cargo run -p ferris_app

Expected output:

App starting...
LOG: Status checked
Uptime: 42 seconds

If you remove the features = ["log_status"] line from the ferris_app/Cargo.toml file dependency declaration, the output will no longer include the log message:

/*
Output:
App starting...
Uptime: 42 seconds
*/

Go ahead and leave the log_status feature enabled in the ferris_app/Cargo.toml file.


Summary

Cargo features give you a clean way to enable or disable functionality without code duplication or branching logic.

They’re especially useful for:

  • Optional logging
  • Debug or performance toggles
  • Integration with optional dependencies (e.g. serde, rayon)
  • Supporting multiple binaries or environments from the same crate

Declaring and Using Features

After understanding what features are, the next step is learning how to declare them in your Cargo.toml and use them effectively in your Rust code. Features are more than just flags—they shape how your code is compiled and what dependencies are included.

This subsection shows how to:

  • Declare features in your crate
  • Use them to enable conditional logic
  • Activate features from within a workspace
  • Enable them manually via the command line

We’ll continue working with our ferris_core crate and log_status feature introduced earlier, then add a second feature to deepen the example.


Declaring Features in Cargo.toml

In ferris_core, we define features in the [features] section of Cargo.toml.

Features can be:

  • Standalone, toggling code within your crate
  • Composed, enabling other features or dependencies (see the sketch just below)
  • Set as default, so they’re enabled automatically unless overridden
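
A composed feature is simply one whose list names other features (or optional dependencies) to enable. As a quick sketch with hypothetical feature names (not part of ferris_core):

[features]
default = []
logging = []
metrics = []
full = ["logging", "metrics"]   # composed: enabling "full" enables both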

Example: Two Features, One as Default (ferris_core/Cargo.toml)

[features]
default = ["log_status"]
log_status = []
verbose = []

  • default: includes log_status, but not verbose.
  • log_status: enables a log message when status is checked.
  • verbose: optionally adds extra diagnostic output.

This setup means log_status is active unless the user disables default features.


Example: Conditional Logic in Code (ferris_core/src/lib.rs)

use ferris_utils::uptime;

pub fn get_status() -> String {
    #[cfg(feature = "log_status")]
    println!("LOG: Status checked");

    #[cfg(feature = "verbose")]
    println!("LOG: Uptime function about to be called");

    let message = format!("Uptime: {} seconds", uptime());

    #[cfg(feature = "verbose")]
    println!("LOG: Message created");

    message
}

Our function will always return message, but all three println! calls are conditionally compiled, depending on what features are enabled. Only the enabled features will include their code in the final binary.

Tip: Use cargo clean when changing features

When you modify which features are enabled for a crate (e.g., adding or removing default-features), Cargo may not immediately rebuild everything from scratch, especially in a workspace.

If you notice that old log messages are still appearing or feature-dependent code isn’t behaving as expected, it might be due to stale compiled artifacts.

Run this to force a clean rebuild:

cargo clean

This ensures Cargo fully re-evaluates which features are enabled and rebuilds the entire crate tree accordingly.


Example: Enabling Features from ferris_app

You can also specify which features of ferris_core to enable in ferris_app/Cargo.toml.

[dependencies]
ferris_core = { path = "../ferris_core", features = ["verbose"] }

Execute

cargo clean
cargo run -p ferris_app

We’re running cargo clean just to ensure our changes to enabled features are applied.

App starting...
LOG: Status checked
LOG: Uptime function about to be called
LOG: Message created
Uptime: 42 seconds

log_status is enabled by default in ferris_core, and verbose is enabled in ferris_app.

Because log_status is in the default feature set, it’s also enabled unless you explicitly disable default features in ferris_app/Cargo.toml:

# Enable only verbose, skip default
ferris_core = { path = "../ferris_core", default-features = false, features = ["verbose"] }

Execute

cargo clean
cargo run -p ferris_app

App starting...
LOG: Uptime function about to be called
LOG: Message created
Uptime: 42 seconds

Because we set default-features to false, we no longer see the message:

LOG: Status checked

Important: This change is only for demonstration.

To follow along with the rest of this post, you should restore the default behavior by reverting to:

ferris_core = { path = "../ferris_core", features = ["verbose"] }

This ensures both log_status and verbose remain enabled moving forward.


Example: Running with Features Manually

You can also enable features from the command line when running directly:

cargo run -p ferris_app --features verbose

Or to override default features:

cargo run -p ferris_app --no-default-features --features verbose

This gives you flexibility during development and testing without editing Cargo.toml.

Note: this command line won’t work if you have not re-exported features from the ferris_app Cargo.toml file:

cargo run -p ferris_app --features verbose

Re-Exporting Features from Dependencies

If you want to let users run commands like:

cargo run -p ferris_app --features verbose

You’ll need to declare that feature in ferris_app itself, even if the underlying behavior lives in ferris_core.

This is known as re-exporting a feature.

Update ferris_app/Cargo.toml:

[features]
verbose = ["ferris_core/verbose"]

[dependencies]
ferris_core = { path = "../ferris_core", default-features = false }

Now, when you run:

cargo run -p ferris_app --features verbose

It will correctly enable the verbose feature on ferris_core.

Note: you can only use --no-default-features to turn off ferris_app default features from the command line.

cargo run -p ferris_app --no-default-features --features verbose

You cannot use --no-default-features to turn off ferris_core default features. The only way to do that is in the ferris_app Cargo.toml file:

[dependencies]
ferris_core = { path = "../ferris_core", default-features = false }

That line specifically applies to the ferris_core features.


Summary

  • Use the [features] section to define named feature flags in your crate.
  • Use #[cfg(feature = "...")] to conditionally compile code.
  • Crates using your crate can opt into features via dependency declarations.
  • Developers can override features at the command line using --features and --no-default-features.


Optional Dependencies and Feature Flags

One of Cargo’s most powerful feature-related tools is the ability to tie features to optional dependencies. This lets you include extra functionality only when needed—minimizing compile time, dependency bloat, and binary size for consumers who don’t require certain features.

Instead of always compiling a dependency—even if it’s not used—you can mark it as optional and enable it only when a corresponding feature is turned on. This pattern is common in libraries that support optional integrations (e.g., serde, rayon, tokio) or offer extra tools (e.g., metrics, logging, diagnostics).

In this section, we’ll show how to mark dependencies as optional, how to link them to features, and how to conditionally compile code based on their presence.


Step-by-Step: Adding an Optional Dependency

We’ll add a new crate ferris_logger to simulate an optional logging facility. It will only be compiled and used when a logging feature is enabled.

Step 1: Create the Optional Dependency Crate

cargo new ferris_logger --lib

Add it to your workspace (ferris_workspace/Cargo.toml):

[workspace]
members = [
    "ferris_core",
    "ferris_app",
    "ferris_utils",
    "ferris_logger"
]

Now replace the default code in our new library crate ferris_logger/src/lib.rs with the following:

pub fn log(message: &str) {
    println!("[FerrisLogger] {message}");
}

Step 2: Declare it as Optional in ferris_core

Update ferris_core/Cargo.toml:

[dependencies]
ferris_utils = { path = "../ferris_utils" }
ferris_logger = { path = "../ferris_logger", optional = true }

[features]
default = []
log_status = ["ferris_logger"]

This:

  • Declares ferris_logger as an optional dependency
  • Links it to the log_status feature
  • Ensures it’s only compiled and used when log_status is enabled

Note that we’ve also dropped the verbose feature from ferris_core. Let’s remove it from the ferris_app/Cargo.toml file as well:

[package]
name = "ferris_app"
version = "0.1.0"
edition = "2021"

[dependencies]
ferris_core = { path = "../ferris_core" }

Note that because log_status is no longer in the ferris_core default features, if you run the app with:

cargo clean
cargo run -p ferris_app

You won’t see the log_status message: “LOG: Status checked”


Step 3: Use the Optional Dependency in ferris_core/src/lib.rs

use ferris_utils::uptime;

#[cfg(feature = "log_status")]
use ferris_logger::log;

pub fn get_status() -> String {
    #[cfg(feature = "log_status")]
    log("Status was checked");

    let message = format!("Uptime: {} seconds", uptime());

    message
}

If log_status is not enabled, ferris_logger won’t even be compiled or linked.

cargo clean
cargo run -p ferris_app

Expected output:

App starting...
Uptime: 42 seconds

Step 4: Use It from ferris_app

Update ferris_app/Cargo.toml:

[features]
log_status = ["ferris_core/log_status"]

[dependencies]
ferris_core = { path = "../ferris_core", default-features = false }

ferris_app/src/main.rs

use ferris_core::get_status;

fn main() {
    println!("App starting...");
    println!("{}", get_status());
}

/*
Output with feature enabled:
App starting...
[FerrisLogger] Status was checked
Uptime: 42 seconds
*/

Step 5: Run It with Logging Enabled

cargo run -p ferris_app --features log_status

Output:

App starting...
[FerrisLogger] Status was checked
Uptime: 42 seconds

Without that feature, ferris_logger is never compiled or included:

cargo run -p ferris_app

Output:

App starting...
Uptime: 42 seconds

Summary

Optional dependencies tied to features let you:

  • Reduce compile time and binary size
  • Offer “plug-in” capabilities like logging or serialization
  • Let users opt in only to what they need

To make this work:

  • Consumers (or crates like ferris_app) opt in via features = [...] or re-exporting
  • Mark the dependency as optional = true
  • Tie it to a feature in [features]
  • Use #[cfg(feature = "...")] to guard the code

Best Practices for Feature Design

Cargo features are a powerful tool, but without thoughtful design, they can quickly lead to messy configurations, brittle builds, or confusing user experiences. Good feature design is about more than just toggling code — it’s about building flexible, predictable, and scalable APIs for consumers of your crate or workspace.

In this section, we’ll look at several best practices for designing features that work well in real-world projects. We’ll also highlight a few pitfalls to avoid as your project grows.

Note: The following best practices and examples are meant to illustrate common patterns in feature design. You don’t need to modify your current ferris_workspace setup — these are here to help you apply good design principles in future projects.


1. Use Additive Features

Features should ideally enable additional functionality, not remove or radically change behavior. This makes them predictable, avoids surprising behavior, and ensures compatibility between feature sets.

Example: Additive vs Non-Additive

Good (additive behavior):

[features]
default = []
logging = ["ferris_logger"]
#[cfg(feature = "logging")]
ferris_logger::log("Extra log info");

🚫 Bad (non-additive/contradictory):

#[cfg(feature = "no_logging")

Additive features compose more easily, especially in workspaces with multiple binaries or libraries.


2. Avoid Feature Combinations That Conflict

Cargo features are unified across all dependents, so if one crate enables feature-a and another enables feature-b, your crate is compiled with both.

Design your features to work independently or compose safely together.

Example: Two additive features

#[cfg(feature = "metrics")]
pub fn record_metric(name: &str) { /* ... */ }

#[cfg(feature = "logging")]
pub fn log_message(msg: &str) { /* ... */ }

#[cfg(all(feature = "metrics", feature = "logging"))]
pub fn log_and_record(name: &str) {
    log_message("Recording metric");
    record_metric(name);
}

This lets you build up behavior instead of requiring mutually exclusive combinations.


3. Keep Defaults Minimal and Safe

Avoid enabling optional dependencies or advanced functionality by default. Let the user opt into complexity.

In our ferris_core example, we set:

[features]
default = []
log_status = ["ferris_logger"]

This ensures consumers won’t get logging unless they explicitly enable it.

If you do use defaults, document them clearly and only include features that are broadly useful and safe.


4. Use Feature Aliases in Binary Crates

When building a binary like ferris_app that depends on feature-rich libraries, you can re-export feature flags so users only have to toggle features in one place.

[features]
log_status = ["ferris_core/log_status"]

[dependencies]
ferris_core = { path = "../ferris_core", default-features = false }

Now users can simply run:

cargo run -p ferris_app --features log_status

…without needing to know the internal feature structure of ferris_core.


5. Group Related Features into Higher-Level Modes

For larger projects, define meta-features that activate a bundle of related capabilities.

[features]
full = ["logging", "metrics", "tracing"]

Then users can do:

cargo build --features full

This simplifies usage and ensures consistent setups.


6. Use cfg(feature = "...") (Not cfg!) to Control Compilation

Use #[cfg(feature = "...")] for compile-time control, not cfg!(). The cfg! macro expands to a compile-time boolean, but the code inside the if is always compiled (and any optional dependencies it references must be present).

Good (compile-time gated):

#[cfg(feature = "logging")]
log("Started up");

Not ideal:

if cfg!(feature = "logging") {
    // code always compiled
    log("Started up");
}

This affects performance and binary size — especially with optional dependencies.


Summary

Well-designed Cargo features make your crate easier to use, test, and scale. Stick to these best practices:

  • Make features additive, not subtractive
  • Avoid conflicting combinations
  • Keep default features minimal
  • Re-export features from binaries for convenience
  • Group features when appropriate
  • Use #[cfg(feature = "...")] to gate code cleanly

Customizing Build Profiles

Cargo provides a set of profiles that let you control how your code is compiled in different situations. These profiles adjust compiler settings such as optimization level, debug information, overflow checks, panic behavior, and more — allowing you to tune for either development speed or runtime performance.

Rust comes with two built-in profiles — dev and release — and also lets you define custom profiles for use cases like benchmarking, fuzzing, or internal staging builds. In this section, we’ll explore what these profiles are, how to customize them, and when to use each one.

Fuzzing (short for fuzz testing) is an automated software testing technique that feeds a program large amounts of random, invalid, or unexpected data to find bugs, crashes, panics, or security vulnerabilities.

The goal is to expose edge cases or inputs that your regular unit or integration tests might miss — especially:

  • Memory safety issues (e.g., use-after-free, buffer overflows)
  • Panics due to unexpected input
  • Logic bugs triggered by unusual data combinations

Built-in Profiles: dev and release

Cargo automatically uses different compilation profiles depending on the command you run. The two most common are:

  • dev: used when you run cargo build or cargo run (without --release)
  • release: used when you run cargo build --release or cargo run --release

These profiles differ in how they balance speed of compilation versus performance of the resulting binary.

Here’s a quick overview:

Profile   Optimization                  Debug Symbols   Overflow Checks   Typical Use
dev       Minimal (opt-level = 0)       Yes             Yes               Development, iteration
release   Aggressive (opt-level = 3)    Optional        No                Production, benchmarks

You can customize either profile in your Cargo.toml.

Debug symbols are metadata embedded in your compiled binary that map the machine code back to your original source code — things like:

  • Function names
  • Variable names and types
  • Source file names and line numbers
  • Stack frames and call traces

They’re what allow tools like gdb, lldb, and rust-lldb to show meaningful output when you’re debugging, like:

thread 'main' panicked at 'index out of bounds: the len is 5 but the index is 10', src/main.rs:12:5

Without debug symbols, that message might just say:

Segmentation fault at 0x0057fa2

Or the backtrace would be unreadable.


Example: Measure Debug vs Release Performance

Let’s create a small benchmark-style example for our ferris_workspace to see the difference between dev and release. Create a new binary crate:

cargo new ferris_bench

Then replace the default content in ferris_bench/src/main.rs with the following:

fn compute_heavy(n: u64) -> u64 {
    (0..n).map(|i| i.wrapping_mul(2)).sum()
}

fn main() {
    let n = 50_000_000;
    let now = std::time::Instant::now();
    let result = compute_heavy(n);
    let elapsed = now.elapsed();
    println!("Result: {result}");
    println!("Elapsed: {:.2?}", elapsed);
}

/*
Output (in dev profile):
Result: 2499999950000000
Elapsed: 700ms - 1.2s (approx)

Output (in release profile):
Result: 2499999950000000
Elapsed: 50ms - 120ms (approx)
*/

Update the root Cargo.toml:

[workspace]
members = [
    "ferris_core",
    "ferris_app",
    "ferris_utils",
    "ferris_logger",
    "ferris_bench"
]

Now run it with the dev and release profiles:

cargo run -p ferris_bench      # dev profile
cargo run -p ferris_bench --release  # release profile

dev profile
Elapsed: 810.74ms

release profile
Elapsed: 86.00ns

The performance difference is significant — and this is why production builds should always use --release.


Example: Customizing Profiles in Cargo.toml

You can tune each profile’s behavior by adding a [profile.*] section to Cargo.toml. In a workspace, profile settings must live in the workspace root Cargo.toml (profiles declared in member crates are ignored with a warning), so add this to ferris_workspace/Cargo.toml:

[profile.dev]
opt-level = 1
debug = true

[profile.release]
opt-level = "z"
lto = true

This configuration:

  • Slightly optimizes dev builds to improve benchmarking without slowing compile time too much
  • Makes release build as small as possible (opt-level = "z") and enables link-time optimization (LTO)

Now you can see how changes in the profile affect binary size and performance.


Summary

Rust’s built-in dev and release profiles let you optimize your workflow for speed or runtime efficiency. You can:

  • Use cargo build and cargo run for fast dev iterations
  • Use --release for production and performance benchmarking
  • Customize profile behavior in each crate’s Cargo.toml

Creating Custom Profiles

While Cargo’s built-in dev and release profiles cover most use cases, you can define custom profiles for more specific needs—like staging, debugging with mild optimizations, fuzzing, or performance benchmarking without full release overhead.

Custom profiles let you tune how your code is compiled, with complete control over optimization levels, debug info, panic behavior, and more—without modifying your main dev or release builds.

Custom profiles are supported on stable Cargo: you define them under [profile.<name>] in Cargo.toml (usually inheriting from dev or release) and select them with --profile <name>. Per-dependency overrides such as [profile.dev.package.<name>] are also available.

Let’s walk through how to set up and use one.

Example: Creating a “benchfast” Profile for Lightweight Benchmarking

In our workspace Cargo.toml file, let’s create a custom profile called benchfast that will:

  • Be faster to compile than full release
  • Have some optimizations (opt-level = 2)
  • Retain debug symbols for profiling
  • Enable LTO (Link-Time Optimization)

Step 1: Add the Custom Profile to ferris_workspace/Cargo.toml

[profile.benchfast]
inherits = "release"
opt-level = 2
debug = true
lto = true

The inherits = "release" line means this profile starts with the same config as release, then overrides specific values.


Step 2: Use the Custom Profile

Run your binary using the new profile:

cargo run -p ferris_bench --profile benchfast

Output:
Result: 2499999950000000
Elapsed: 78.00ns

This gives you:

  • Faster builds than full release
  • Still-optimized code
  • Debug symbols for profiling

Where to Define Custom Profiles in Rust Projects

Standalone Crate:

If your project is a single crate (not in a workspace), define custom profiles like [profile.benchfast] directly in your crate’s Cargo.toml.

[profile.benchfast]
inherits = "release"
opt-level = 2
debug = true
lto = true

Workspace (especially virtual workspaces):

If you’re using a workspace with a root Cargo.toml containing [workspace], define all custom profiles in the root Cargo.toml, not in the individual member crates.

[workspace]
members = [
    "myapp",
    "mylib",
    "bench"
]

[profile.benchfast]
inherits = "release"
opt-level = 2
debug = true
lto = true

Defining custom profiles in the wrong place will result in errors like profile 'benchfast' is not defined or warnings about ignored profiles.

Summary:

  • Single crate: Put custom profiles in that crate’s Cargo.toml.
  • Workspace: Put them in the workspace root Cargo.toml only.

Example: Compare Sizes Across Profiles

Replace the content in ferris_bench/src/main.rs with the following:

fn main() {
    println!("Benchmarking binary profile");
}

Then build with each profile:

cargo build -p ferris_bench               # dev
cargo build -p ferris_bench --release     # release
cargo build -p ferris_bench --profile benchfast

Compare file sizes:

ls -lh target/debug/ferris_bench
ls -lh target/release/ferris_bench
ls -lh target/benchfast/ferris_bench

You’ll see a size/performance tradeoff between each profile.

ls -lh target/debug/ferris_bench: 425K 
ls -lh target/release/ferris_bench: 390K 
ls -lh target/benchfast/ferris_bench: 364K

Optional: Profile Overriding for Specific Dependencies

You can also set different profiles for dependencies.

For example, add the following to ferris_workspace/Cargo.toml (or to ferris_bench/Cargo.toml if it were a standalone crate rather than a workspace member):

[profile.dev.package.ferris_utils]
opt-level = 3

This tells Cargo: “When building ferris_utils in a dev build, compile it with high optimization.”

This is useful when:

  • You’re using a performance-sensitive utility or math crate
  • You want to test performance in dev builds without using --release

Summary

Custom profiles give you:

  • More flexibility for nuanced use cases like profiling or fast internal builds
  • A way to create purpose-specific configurations without modifying core settings
  • The ability to tune individual crates differently from the rest of the project

Try using custom profiles when working with performance-critical code or benchmarking tools like criterion, fuzzing frameworks, or embedded builds.


Optimizations and Debug Info

Rust’s build profiles give you precise control over how the compiler optimizes your code and what debug information gets included. By adjusting opt-level, debug, and related profile keys, you can fine-tune the trade-off between compile time, runtime performance, binary size, and debuggability.

In this section, we’ll explain what each key does, show common combinations for real-world use cases, and demonstrate how they affect the behavior of your application.


Profile Keys Overview

Key               Description
opt-level         Level of compiler optimizations (0–3, "s", or "z")
debug             Enables debug symbols (true, false, or a level 0–2)
lto               Enables link-time optimization (true, false, "thin")
panic             Sets panic strategy (unwind or abort)
overflow-checks   Enables runtime integer overflow checks
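
The panic and overflow-checks keys don’t appear in the examples that follow, so here is a quick sketch of a release profile that aborts on panic and keeps integer overflow checks (adjust the values to your own needs):

[profile.release]
opt-level = 3
panic = "abort"          # abort on panic instead of unwinding (smaller binary, no unwinding machinery)
overflow-checks = true   # keep integer overflow checks even in optimized builds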

Example: Debug-Friendly Optimization for Development

Sometimes you want faster runtime performance while keeping debug info for profiling and stack traces.

Update your root ferris_workspace/Cargo.toml:

[profile.dev]
opt-level = 1    # Slight optimization (safe for dev)
debug = true     # Keep debug symbols

This builds quickly but runs much faster than the default opt-level = 0.

Now replace the contents of ferris_bench/src/main.rs with the following:

fn main() {
    let sum: u64 = (0u64..10_000_000).map(|x| x.wrapping_mul(2)).sum();
    println!("Sum: {}", sum);
}

/*
Output:
Sum: 99999990000000
*/

Now run it:

cargo run -p ferris_bench

You’ll notice this version runs significantly faster than the fully unoptimized default.


Example: Tiny, Fast Release Builds with Debug Info

Let’s build a release profile that’s optimized for minimal binary size, but still has some debug info for crash reporting.

Update root ferris_workspace/Cargo.toml:

[profile.release]
opt-level = "z"    # Optimize for size
debug = 1          # Include line number info
lto = true         # Link-time optimization

Build with:

cargo build -p ferris_bench --release

This produces a small, fast binary with enough symbol data for backtraces.


Example: Custom Profile for Benchmarking

You might want to benchmark code with real-world optimizations but keep debug info for profiling tools.

In the root ferris_workspace/Cargo.toml:

[profile.benchfast]
inherits = "release"
opt-level = 2
debug = true
lto = "thin"

Now build and run:

cargo build -p ferris_bench --profile benchfast
./target/benchfast/ferris_bench

This gives you the best of both worlds — fast enough to test realistically, but debuggable.


Summary

Tuning opt-level and debug helps you:

  • Iterate faster in development
  • Ship smaller, faster binaries
  • Collect meaningful debug information when needed
  • Benchmark code in near-release conditions

Tip: Try different combinations to see how they impact runtime performance, build time, and binary size for your own projects.


Using Build Scripts Effectively

Sometimes your Rust project needs to do more than just compile source code. You might want to embed metadata (like Git commit hashes), compile C or C++ code, detect system libraries, or generate Rust code before compilation starts.

That’s where build scripts come in.

A build script is a special Rust file named build.rs, placed at the root of your crate. Cargo automatically runs it before compiling the rest of your crate. The script can perform custom tasks and communicate with Cargo using environment variables and print statements.

In this section, we’ll explore what a build script is, how it works, and how to use it effectively in real-world scenarios.


What is a Build Script (build.rs)?

A build script is an auxiliary Rust program that Cargo compiles and runs before it builds the main crate. It lives in your crate root as build.rs.

The purpose of a build script is to perform setup work and pass information to the compiler or Cargo using special print directives like:

  • cargo:rustc-env=KEY=VALUE — sets an environment variable for use in your code
  • cargo:rerun-if-changed=path — tells Cargo when to re-run the script
  • cargo:rustc-link-lib=name — links a native library
  • cargo:warning=msg — emits a compiler warning during the build
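
For example, the warning directive is an easy way to surface build-time information; a minimal build.rs could look like this (a small sketch):

fn main() {
    // Cargo prints this as a warning while building the crate
    println!("cargo:warning=building with bundled defaults");
}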

Let’s look at several examples. Some are just to illustrate aspects of build scripts and are not designed for you to actually perform; others are easy for you to try.

If you want to perform some of the examples, run this command to create a new binary crate and add it to the workspace:

cargo new ferris_build_demo

Now ensure the workspace Cargo.toml has the new crate:

[workspace]
members = [ 
  "ferris_app", 
  "ferris_bench", 
  "ferris_build_demo",
  "ferris_core", 
  "ferris_diag", 
  "ferris_logger", 
  "ferris_utils"
]

Example: Embedding a Git Commit Hash

Note: you might have difficulty following along with this example due to Git issues. When you execute cargo new NAME_OF_CRATE outside of an existing Git repository, Cargo initializes a new Git repository for the crate. In a standalone crate this may not be an issue, but in our ferris_workspace it can cause Git problems, so you may prefer to treat this example as an illustration only.

Let’s embed the current Git commit hash into your binary.

Go ahead and manually create a build.rs file at the crate root.

ferris_build_demo/
├── build.rs          ← create this file
├── Cargo.toml
└── src/
    └── main.rs

Now add this code to the build.rs file:

use std::process::Command;

fn main() {
    let output = Command::new("git")
        .args(&["rev-parse", "--short", "HEAD"])
        .output()
        .expect("failed to run git");

    let hash = String::from_utf8_lossy(&output.stdout);
    println!("cargo:rustc-env=GIT_HASH={}", hash.trim());
}

This sets the GIT_HASH environment variable at compile time.

Now replace the default code in ferris_build_demo/src/main.rs with the following:

fn main() {
    let git_hash = env!("GIT_HASH");
    println!("App built from Git commit: {}", git_hash);
}

/*
Output:
App built from Git commit: a1b2c3d
*/

Now build or run the program:

cargo build -p ferris_build_demo

cargo run -p ferris_build_demo

If you change your Git commit, Cargo won’t re-run the build script unless you explicitly tell it to:

Add this to build.rs to watch .git/HEAD:

println!("cargo:rerun-if-changed=.git/HEAD");

Summary

A build script is a powerful way to:

  • Embed dynamic or system-level data at build time
  • Generate code, metadata, or conditional behavior
  • Control when Cargo should re-run certain build logic

Common Use Cases

Build scripts are most useful when your project needs to perform tasks that Rust itself doesn’t directly support—like interacting with the system, compiling foreign code, or injecting dynamic metadata at compile time.

In this section, we’ll explore several common and valuable use cases where build.rs enhances your build process.


Example: Embedding Build Time

Let’s embed the current UTC build timestamp in your binary so the compiled app can print when it was built.

Add the following build dependency to ferris_build_demo/Cargo.toml for chrono:

[build-dependencies]
chrono = "0.4"

Replace the contents of build.rs with the following:

use chrono::Utc;

fn main() {
    let timestamp = Utc::now().to_rfc3339();
    println!("cargo:rustc-env=BUILD_TIMESTAMP={}", timestamp);
}

Replace the contents of ferris_build_demo/src/main.rs with the following:

fn main() {
    let build_time = env!("BUILD_TIMESTAMP");
    println!("This binary was built at: {}", build_time);
}

/*
Output:
This binary was built at: 2025-04-21T18:44:37+00:00
*/

Now run the program:

cargo run -p ferris_build_demo

Example: Detecting the Target Platform

You can pass platform-specific info into your code by reading Cargo-provided environment variables and setting your own.

Replace the contents of build.rs with the following:

fn main() {
    let target = std::env::var("TARGET").unwrap_or_default();
    println!("cargo:rustc-env=TARGET_TRIPLE={}", target);
}

Replace the contents of ferris_build_demo/src/main.rs with the following:

fn main() {
    let target = env!("TARGET_TRIPLE");
    println!("This binary was built for: {}", target);
}

/*
Output (example):
This binary was built for: x86_64-apple-darwin
*/

Now run the program:

cargo run -p ferris_build_demo

This can help you write platform-specific behavior or diagnostics without needing cfg(...) in your main code.
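
For instance, the embedded triple can drive a simple runtime branch, reusing the TARGET_TRIPLE variable set by the build script above (a sketch):

fn main() {
    let target = env!("TARGET_TRIPLE");
    // Branch at runtime on the triple that was baked in at build time
    if target.contains("windows") {
        println!("Using Windows-specific diagnostics");
    } else {
        println!("Using Unix-style diagnostics");
    }
}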


Example: Link a C Library (native code, so you’ll need to have a C environment)

Build scripts can also compile and link native C/C++ code using cc or cmake crates.

Replace the contents of build.rs with the following:

fn main() {
    cc::Build::new()
        .file("native/hello.c")
        .compile("hello");
}

Add the cc crate to the build-dependencies:

[build-dependencies]
cc = "1.0"

Create a file ferris_build_demo/native/hello.c and add this code:

#include <stdio.h>

void hello() {
    printf("Hello from C!\n");
}

Replace the contents of ferris_build_demo/src/main.rs with the following:

unsafe extern "C" {
    fn hello();
}

fn main() {
    unsafe { hello(); }
}

/*
Output:
Hello from C!
*/

Now run the program:

cargo run -p ferris_build_demo

What the cc Crate Does

The cc crate provides a build-time interface for compiling C/C++ source files as part of your crate’s build process. It’s typically used from build.rs.

It handles:

  • Compiling .c or .cpp files into object code
  • Linking those compiled objects into your Rust crate
  • Automatically setting platform-appropriate compiler flags
  • Triggering recompilation if the C source changes

Summary

Some of the most common use cases for build.rs include:

  • Embedding metadata (Git hash, build time, version info)
  • Detecting build target or environment
  • Linking C/C++ code using cc or cmake
  • Generating code or assets at build time
  • Emitting compile-time warnings

Communicating Between Build Script and Code

Build scripts don’t modify your code directly — instead, they emit instructions to Cargo, which then makes information available to your Rust crate at compile time. The most common way to send data from a build script to your code is by setting compile-time environment variables using:

println!("cargo:rustc-env=KEY=VALUE");

Your Rust code can then read these values using env!() or option_env!().

In this section, we’ll explore several patterns for safely and effectively communicating from build.rs to your main crate.


Example: Accessing Values with env!

The env! macro lets you embed build-time environment variables into your code at compile time. If the variable isn’t set, the build will fail.

Replace the contents of build.rs with the following:

fn main() {
    println!("cargo:rustc-env=APP_MODE=production");
}

Replace the contents of ferris_build_demo/src/main.rs with the following:

fn main() {
    let mode = env!("APP_MODE");
    println!("Running in mode: {}", mode);
}

/*
Output:
Running in mode: production
*/

Now run the program:

cargo run -p ferris_build_demo

Use env! when the variable is always guaranteed to be set by build.rs.


Example: Accessing Optional Values with option_env!

If a value may or may not be set, use option_env! to avoid a compile error.

Replace the contents of build.rs with the following:

fn main() {
    // Only set the variable conditionally
    if cfg!(debug_assertions) {
        println!("cargo:rustc-env=DEBUG_MODE=true");
    }
}

Replace the contents of ferris_build_demo/src/main.rs with the following:

fn main() {
    match option_env!("DEBUG_MODE") {
        Some("true") => println!("Debug mode is enabled"),
        _ => println!("Release or unknown mode"),
    }
}

/*
Output (dev):
Debug mode is enabled

Output (release):
Release or unknown mode
*/

Use option_env! for conditional, fallible, or optional configuration.
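
If you just want a fallback value rather than a match, you can combine option_env! with unwrap_or (a small sketch):

fn main() {
    // Falls back to "false" when build.rs didn’t set the variable
    let debug_mode = option_env!("DEBUG_MODE").unwrap_or("false");
    println!("Debug mode flag: {}", debug_mode);
}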


Example: Communicating Through Files

Sometimes a build script needs to pass structured or dynamically generated data. In this case, it’s common to generate a Rust source file and include!() it.

Replace the contents of build.rs with the following:

use std::fs;

fn main() {
    let config = r#"pub const MAX_USERS: u32 = 1000;"#;
    fs::write("src/generated_config.rs", config).unwrap();
}

Replace the contents of ferris_build_demo/src/main.rs with the following:

include!("generated_config.rs");

fn main() {
    println!("Max users allowed: {}", MAX_USERS);
}

/*
Output:
Max users allowed: 1000
*/

Use this method when you want to pass structured constants or code from build time into your program.
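
A common variant of this pattern avoids writing into src/ by generating the file into Cargo’s OUT_DIR and including it from there (a sketch of the same MAX_USERS constant):

// build.rs
use std::{env, fs, path::Path};

fn main() {
    let out_dir = env::var("OUT_DIR").unwrap();
    let dest = Path::new(&out_dir).join("generated_config.rs");
    fs::write(dest, "pub const MAX_USERS: u32 = 1000;").unwrap();
}

// src/main.rs
include!(concat!(env!("OUT_DIR"), "/generated_config.rs"));

fn main() {
    println!("Max users allowed: {}", MAX_USERS);
}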


Summary

Rust build scripts can pass values to your crate in three main ways:

  • env!() — strict, compile-time required values
  • option_env!() — optional or conditional values
  • include!() — dynamically generated source code (use with care)

Conditional Compilation with Cargo

Sometimes your Rust code needs to behave differently depending on the target platform, build profile, enabled features, or other compile-time conditions. Rust offers powerful support for this through conditional compilation, allowing you to include or exclude blocks of code using configuration flags.

This is especially useful for:

  • Platform-specific behavior (e.g., windows vs unix)
  • Enabling optional features with [features] and --features
  • Debug-only instrumentation or assertions
  • Experimental or version-specific implementations

In this section, we’ll explore how to use the cfg and cfg_attr attributes to make your Rust code conditional in a clean, maintainable way.


cfg and cfg_attr

The cfg and cfg_attr attributes let you compile code only when specific conditions are met. These conditions can include target architecture, OS, feature flags, debug/release mode, and more.

  • #[cfg(condition)]: includes code only if the condition is true
  • #[cfg_attr(condition, attr)]: conditionally adds another attribute

Let’s look at both in action.


Example: Platform-Specific Code with cfg

Replace the contents of ferris_build_demo/src/main.rs with the following:

fn main() {
    platform_greeting();
}

#[cfg(target_os = "linux")]
fn platform_greeting() {
    println!("Hello from Linux!");
}

#[cfg(target_os = "windows")]
fn platform_greeting() {
    println!("Hello from Windows!");
}

#[cfg(target_os = "macos")]
fn platform_greeting() {
    println!("Hello from macOS!");
}

/*
Output (on Linux):
Hello from Linux!

Output (on Windows):
Hello from Windows!

Output (on macOS):
Hello from macOS!
*/

Each #[cfg(...)] annotation tells the compiler: Only include this item if the condition matches.


Example: Using Feature Flags with cfg

Let’s assume we’re toggling a fancy_output feature to switch between basic and advanced UI.

Add the following to ferris_build_demo/Cargo.toml:

[features]
default = []
fancy_output = []

Replace the contents of ferris_build_demo/src/main.rs with the following:

fn main() {
    print_banner();
}

#[cfg(feature = "fancy_output")]
fn print_banner() {
    println!("Welcome to FerrisApp Pro");
}

#[cfg(not(feature = "fancy_output"))]
fn print_banner() {
    println!("Welcome to FerrisApp");
}

/*
Output (default):
Welcome to FerrisApp

Output (--features fancy_output):
Welcome to FerrisApp Pro
*/

Run with the feature enabled:

cargo run -p ferris_build_demo --features fancy_output

Example: Conditionally Applying an Attribute with cfg_attr

Sometimes you want to apply an attribute like #[derive(Debug)] only when a condition is met. That’s what cfg_attr is for.

Replace the contents of ferris_build_demo/src/main.rs with the following:

#[cfg_attr(debug_assertions, derive(Debug))]
struct Session {
    id: u32,
}

fn main() {
    let session = Session { id: 42 };

    #[cfg(debug_assertions)]
    println!("Struct can be printed with {:?}: {:?}", stringify!(Session), session);

    #[cfg(not(debug_assertions))]
    println!("Debug trait may not be available");
}

/*
Output (debug build):
Struct can be printed with Session: Session { id: 42 }

Output (release build):
Debug trait may not be available
*/

Now run the program:

cargo run -p ferris_build_demo
cargo run -p ferris_build_demo --release

This tells Rust: Apply the derive(Debug) attribute only in debug builds (where debug_assertions is true).


Summary

Rust’s conditional compilation system is expressive and robust. Use:

  • #[cfg(...)] to include/exclude code based on compile-time conditions
  • #[cfg_attr(..., ...)] to conditionally apply attributes like derive, inline, or test
  • Feature flags and --features to toggle optional functionality

Tying Features to Conditional Compilation

Cargo features allow you to opt into additional functionality at compile time. Combined with #[cfg(feature = "...")], you can conditionally include or exclude blocks of Rust code based on which features are enabled in the crate or passed via --features.

This is ideal for:

  • Adding optional dependencies or behaviors
  • Switching between lightweight and full-featured builds
  • Letting consumers of your library pick which components to enable

In this section, we’ll show how to define features in Cargo.toml and use them with cfg to gate functionality cleanly and idiomatically.


Example: Conditionally Compile a Function

Let’s conditionally compile a debug_banner() function based on a Cargo feature.

Create a new binary crate:

cargo new ferris_banner_demo

Add this to ferris_banner_demo/Cargo.toml:

[features]
default = []
debug_banner = []

Replace ferris_banner_demo/src/main.rs with the following:

fn main() {
    println!("Launching app...");
    
    #[cfg(feature = "debug_banner")]
    debug_banner();
}

#[cfg(feature = "debug_banner")]
fn debug_banner() {
    println!("[DEBUG] This is a debug banner.");
}

/*
Output (no feature):
Launching app...

Output (with feature):
Launching app...
[DEBUG] This is a debug banner.
*/

Build with:

cargo run -p ferris_banner_demo --features debug_banner
Launching app...
[DEBUG] This is a debug banner.

This keeps the function out of the binary entirely unless the feature is enabled.
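A related tool is the cfg! macro, which evaluates to a compile-time boolean instead of removing code entirely. Both branches are always type-checked, so it only fits cases where the code is valid regardless of the feature. A sketch using the same debug_banner feature:

fn main() {
    println!("Launching app...");

    // cfg!(...) becomes `true` or `false` at compile time, but unlike
    // #[cfg(...)], the code in both branches must still compile.
    if cfg!(feature = "debug_banner") {
        println!("[DEBUG] This is a debug banner.");
    }
}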


Example: Toggling Implementation with cfg

Features can also drive which implementation of a function gets compiled.

main.rs

fn main() {
    render_ui();
}

fn render_ui() {
    render();
}

#[cfg(feature = "fancy_ui")]
fn render() {
    println!("Rendering fancy UI with themes and animations...");
}

#[cfg(not(feature = "fancy_ui"))]
fn render() {
    println!("Rendering basic UI.");
}

/*
Output (no feature):
Rendering basic UI.

Output (with feature):
Rendering fancy UI with themes and animations...
*/

Cargo.toml

[features]
fancy_ui = []

Build:

cargo run -p ferris_banner_demo
Rendering basic UI.

cargo run -p ferris_banner_demo --features fancy_ui
Rendering fancy UI with themes and animations...

Tip: Use Conditional Dependencies with Features

If a feature enables external functionality, tie it to a dependency:

Cargo.toml

[package]
name = "ferris_banner_demo"
version = "0.1.0"
edition = "2021"

[features]
with_logging = ["log", "env_logger"]

[dependencies]
log = { version = "0.4", optional = true }
env_logger = { version = "0.10", optional = true }

Then conditionally use it in your code – main.rs:

#[cfg(feature = "with_logging")]
use log::info;

fn main() {
    #[cfg(feature = "with_logging")]
    {
        env_logger::init();
        info!("App started with logging");
    }

    println!("App running...");
}

/*
Output (default, no features):
App running...

Output (with --features with_logging, no RUST_LOG set):
App running...  // log is suppressed by default

Output (with --features with_logging and RUST_LOG=info):
App started with logging
App running...
*/

How to Run It

From your workspace root, run:

RUST_LOG=info cargo run -p ferris_banner_demo --features with_logging
[2025-04-22T12:33:36Z INFO  ferris_banner_demo] App started with logging
App running...

If you omit RUST_LOG=info, log messages like info!() will be suppressed.

This ensures the log crate is only compiled and linked if with_logging is enabled.
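One way to keep the cfg noise out of the rest of the program is to hide the optional dependency behind a small helper. A sketch, assuming the same with_logging feature, with a default filter of info so messages appear even when RUST_LOG is unset:

// Compiled only when `with_logging` is enabled.
#[cfg(feature = "with_logging")]
fn init_logging() {
    // default_filter_or("info") shows info-level messages even if RUST_LOG is unset.
    env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
}

// No-op fallback when the feature is disabled.
#[cfg(not(feature = "with_logging"))]
fn init_logging() {}

fn main() {
    init_logging();

    #[cfg(feature = "with_logging")]
    log::info!("App started with logging");

    println!("App running...");
}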


Summary

  • Define optional features in Cargo.toml using [features]
  • Use #[cfg(feature = "...")] to include or exclude code based on enabled features
  • Use #[cfg(not(feature = "..."))] for alternative code paths
  • Combine features with optional dependencies for lightweight, customizable builds

Target-specific Compilation

Rust allows you to write code that compiles and runs only on specific platforms — like Linux, Windows, macOS, or embedded environments — using target-specific cfg conditions.

This is useful for:

  • Platform-specific behavior (e.g., path separators, system calls)
  • Architecture-specific optimizations (e.g., x86 vs ARM)
  • Supporting embedded, no_std, or wasm targets

In this section, we’ll explore how to detect targets using #[cfg(...)], and how to use Cargo to cross-compile or switch targets explicitly.


Example: OS-specific Behavior

main.rs

fn main() {
    print_platform_message();
}

#[cfg(target_os = "linux")]
fn print_platform_message() {
    println!("You're running on Linux.");
}

#[cfg(target_os = "windows")]
fn print_platform_message() {
    println!("You're running on Windows.");
}

#[cfg(target_os = "macos")]
fn print_platform_message() {
    println!("You're running on macOS.");
}

/*
Output (on Linux):
You're running on Linux.

Output (on Windows):
You're running on Windows.
*/

Run the app:

cargo run -p ferris_banner_demo
You're running on macOS.

This example uses target_os, but you can also use target_family, target_env, and other built-in cfg options.
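For example, target_family lets you branch on the broader platform family rather than a specific OS. A quick sketch:

// `unix` covers Linux, macOS, the BSDs, and similar; `windows` covers Windows targets.
#[cfg(target_family = "unix")]
fn native_line_ending() -> &'static str {
    "\n"
}

#[cfg(target_family = "windows")]
fn native_line_ending() -> &'static str {
    "\r\n"
}

fn main() {
    println!("Line ending bytes: {:?}", native_line_ending().as_bytes());
}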


Example: Architecture-specific Optimization

main.rs

fn main() {
    do_math();
}

#[cfg(target_arch = "x86_64")]
fn do_math() {
    println!("Running 64-bit optimized math routine.");
}

#[cfg(target_arch = "arm")]
fn do_math() {
    println!("Running ARM-optimized math routine.");
}

#[cfg(not(any(target_arch = "x86_64", target_arch = "arm")))]
fn do_math() {
    println!("Running fallback math routine.");
}

/*
Output (on x86_64):
Running 64-bit optimized math routine.
*/

Run the app:

cargo run -p ferris_banner_demo
Running 64-bit optimized math routine.

You can combine conditions using any(...), all(...), or not(...).
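For instance, here is a hedged sketch (a standalone example, not meant to be pasted into the crate above) that only compiles a fast path when both the OS and the architecture match:

// Both clauses inside all(...) must hold for the first version to be compiled.
#[cfg(all(target_os = "linux", target_arch = "x86_64"))]
fn fast_path() {
    println!("Running the Linux x86_64 fast path.");
}

// Fallback for every other OS/architecture combination.
#[cfg(not(all(target_os = "linux", target_arch = "x86_64")))]
fn fast_path() {
    println!("Running the portable fallback.");
}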


Example: Compile-only Constants per Target

If you want different constants or configuration values for each platform, you can conditionally define them:

main.rs

#[cfg(target_os = "windows")]
const CONFIG_PATH: &str = "C:\\Ferris\\config";

#[cfg(not(target_os = "windows"))]
const CONFIG_PATH: &str = "/etc/ferris/config";

fn main() {
    println!("Using config file at: {}", CONFIG_PATH);
}

/*
Output (on Windows):
Using config file at: C:\Ferris\config

Output (on Linux/macOS):
Using config file at: /etc/ferris/config
*/

Run the app:

cargo run -p ferris_banner_demo
Using config file at: /etc/ferris/config

This is especially useful for system paths, environment-dependent defaults, or third-party tool integrations.
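If you only need this information at runtime rather than at compile time, the standard library also exposes the target’s OS, architecture, and family as constants. A small sketch:

fn main() {
    // These are plain &'static str constants baked in at compile time,
    // so no cfg attributes are needed to read them.
    println!("OS: {}", std::env::consts::OS);
    println!("Architecture: {}", std::env::consts::ARCH);
    println!("Family: {}", std::env::consts::FAMILY);
}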


Optional: Show Active Target Triple

(Note: cross-compiling with an explicit --target often surfaces platform-specific toolchain issues you’ll need to work through.)

You can inspect your active build target using:

rustc -vV
rustc 1.86.0 (05f9846f8 2025-03-31)
binary: rustc
commit-hash: 05f9846f893b09a1be1fc8560e33fc3c815cfecb
commit-date: 2025-03-31
host: x86_64-apple-darwin
release: 1.86.0
LLVM version: 19.1.7

Or get just the target triple with:

rustc -vV | grep host
host: x86_64-apple-darwin

To cross-compile, specify a target:

cargo build --target x86_64-unknown-linux-gnu

You may need to install the target with rustup target add <target>.


Summary

Use #[cfg(target_os = "...")], #[cfg(target_arch = "...")], and similar attributes to write platform-specific Rust code. This gives your crate portable, optimized, and safe behavior across environments.

  • Use cfg for branching code paths by target
  • Use conditional const values for target-specific defaults
  • Use --target to cross-compile and test across platforms
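Putting these ideas together, a common pattern in larger crates is to gate whole modules per platform and expose one uniform API. A sketch (covering only Unix-like and Windows targets):

// Each platform gets its own module; only one is compiled into the binary.
#[cfg(unix)]
mod platform {
    pub fn config_dir() -> &'static str {
        "/etc/ferris"
    }
}

#[cfg(windows)]
mod platform {
    pub fn config_dir() -> &'static str {
        "C:\\Ferris"
    }
}

fn main() {
    println!("Config dir: {}", platform::config_dir());
}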

Cargo Commands You Might Be Missing

As your Rust workflow matures, it’s easy to stick with cargo build and cargo run for everything. But Cargo offers a growing toolbox of commands tailored for speed, debugging, introspection, and optimization — many of which can significantly improve your dev experience.

In this section, we’ll highlight lesser-known but incredibly useful commands you might be overlooking — and show when and why you should use them.


cargo check vs cargo build

When working on a Rust project, you might instinctively run cargo build just to make sure your code compiles. But for most day-to-day editing, there’s a faster, smarter option: cargo check.

Let’s explore the difference.


What cargo build Does

cargo build
  • Compiles your crate and all dependencies
  • Produces actual binary artifacts in target/
  • Slower than necessary when you just want to know if the code compiles

What cargo check Does

cargo check
  • Parses and analyzes your code for errors
  • Does not produce binaries or object files
  • Much faster because it skips the final codegen and linking stages

Side-by-side Example

Create a simple Rust crate:

cargo new check_vs_build_demo
cd check_vs_build_demo

Edit main.rs to add a small error:

fn main() {
    let x = "hello";
    println!("{} {}", x, y); // 'y' is undefined
}

Then try both commands:

cargo check
error[E0425]: cannot find value `y` in this scope
 --> src/main.rs:3:26
  |
3 |     println!("{} {}", x, y); // 'y' is undefined
  |                          ^ help: a local variable with a similar name exists: `x`

Runs quickly, just checks correctness.


cargo build

Same output, but slower, because it goes through full compilation phases even though the code is invalid.


When to Use Each

Command        Purpose
cargo check    Fastest way to validate your code
cargo build    Compile and generate binaries
cargo run      Build and run the compiled binary

Use cargo check while iterating and editing. Only use cargo build when you need to run, benchmark, or debug the final binary.


Summary

  • Use cargo check for lightning-fast feedback
  • Use cargo build when you’re preparing to run, test, or release
  • Combine check into your edit-save loop for quicker iteration

cargo tree and cargo outdated

Understanding your crate’s dependencies — what you’re pulling in, why, and what versions — is crucial for performance, security, and maintenance. Rust gives you two fantastic tools for this: cargo tree and cargo outdated.

These commands help you:

  • Visualize your dependency graph (cargo tree)
  • Identify outdated crates and available upgrades (cargo outdated)

Let’s look at how to use both.


cargo tree: Visualizing Dependencies

cargo tree displays your crate’s dependency graph in a tree format, showing both direct and indirect dependencies.

cargo tree has shipped with Cargo itself since Rust 1.44, so there is nothing extra to install.

Cargo.toml

[dependencies]
serde = { version = "1.0", features = ["derive"] }

Then run:

cargo tree

This gives you a tree view like:

check_vs_build_demo v0.1.0 (/path/to/check_vs_build_demo)
└── serde v1.0.219
    └── serde_derive v1.0.219 (proc-macro)
        ├── proc-macro2 v1.0.95
        │   └── unicode-ident v1.0.18
        ├── quote v1.0.40
        │   └── proc-macro2 v1.0.95 (*)
        └── syn v2.0.100
            ├── proc-macro2 v1.0.95 (*)
            ├── quote v1.0.40 (*)
            └── unicode-ident v1.0.18

You can also root the tree at a specific package, for example a dependency:

cargo tree -p serde

Example: Analyze a Real Crate

Cargo.toml

[dependencies]
reqwest = { version = "0.11", features = ["blocking"] }
serde = { version = "1.0", features = ["derive"] }

main.rs

#![allow(unused_imports, dead_code)] // the imports and struct exist only to pull the dependencies into the graph

use reqwest::blocking::get;
use serde::Deserialize;

#[derive(Deserialize)]
struct Foo {
    bar: String,
}

fn main() {
    println!("Placeholder for dependency graph example.");
}

/*
Output:
Placeholder for dependency graph example.
*/

Run:

cargo tree

You’ll see reqwest pulling in tokio, hyper, serde_json, etc., even if you didn’t explicitly request them.


cargo outdated: Find Available Upgrades

This command compares the versions of your dependencies in use against what’s available on crates.io.

cargo install cargo-outdated
cargo outdated

It shows output like:

Name     Project  Compat   Latest   Kind        Platform
serde    1.0.197  1.0.198  1.0.198  Normal      ---
tokio    1.35.1   1.37.0   1.37.0   Transitive  ---

  • Project: current version in Cargo.lock
  • Compat: highest semver-compatible version
  • Latest: newest available version (even if breaking)
  • Kind: direct or transitive

Summary

  • cargo tree helps you understand your dependency graph
  • cargo outdated shows you what needs an upgrade
  • Keep these in your regular maintenance toolkit to avoid bloat and stale code

cargo expand and cargo bench

Rust gives you fine control over your code, performance, and tooling — and some of its most powerful tools are hidden just beneath the surface. Two standout examples are:

  • cargo expand, which shows how your macros and traits are transformed into actual code
  • cargo bench, which runs performance benchmarks using Criterion-style test harnesses

These commands give you visibility into code generation and runtime performance.


cargo expand: See the Code Rust Actually Compiles

Rust macros and derives often hide complexity. cargo expand lets you see the fully expanded version of your code — including all macro-generated code, trait implementations, and more.

cargo install cargo-expand

Cargo.toml

[dependencies]
serde = { version = "1.0", features = ["derive"] }

main.rs

use serde::Serialize;

#[derive(Serialize)]
struct User {
    id: u32,
    name: String,
}

fn main() {}

Run:

cargo expand

This will print out:

#![feature(prelude_import)]
#[prelude_import]
use std::prelude::rust_2024::*;
#[macro_use]
extern crate std;
use serde::Serialize;
struct User {
    id: u32,
    name: String,
}
#[doc(hidden)]
#[allow(
    non_upper_case_globals,
    unused_attributes,
    unused_qualifications,
    clippy::absolute_paths,
)]
... more output

You can also expand specific modules:

cargo expand --lib
cargo expand --bin my_binary

cargo bench: Benchmarking with Precision

Rust’s built-in benchmark harness (#[bench]) is only available on the nightly compiler, so to use it with cargo bench you need nightly. (The Criterion crate, shown below, also works on stable.)

rustup install nightly

To use nightly just for the current project:

Run this inside the project directory:

rustup override set nightly

Then in your crate:

cargo bench

To use Criterion (recommended over the built-in benches), add this to Cargo.toml. The [[bench]] entry with harness = false is required so Cargo doesn’t run the benchmark under the default libtest harness:

[dev-dependencies]
criterion = "0.5"

[[bench]]
name = "sorting"
harness = false

Then create a benches/ directory and add a file named benches/sorting.rs:

use criterion::{black_box, criterion_group, criterion_main, Criterion};

fn sorting_benchmark(c: &mut Criterion) {
    c.bench_function("sort 1000", |b| {
        b.iter(|| {
            let mut data: Vec<u32> = (0..1000).rev().collect();
            data.sort();
            black_box(data);
        });
    });
}

criterion_group!(benches, sorting_benchmark);
criterion_main!(benches);

Run it with:

cargo bench
sort 1000               time:   [20.123 µs 20.456 µs 20.821 µs]

Summary

  • cargo expand helps you understand macro-generated code — especially for derive, procedural macros, and DSLs
  • cargo bench gives you fine-grained performance metrics — especially with Criterion
  • Combine both to debug macro magic and optimize your hot paths

In this post, we explored powerful yet often underused aspects of Cargo that can help you level up your Rust projects — from organizing multi-crate workspaces to fine-tuning builds, enabling conditional features, and benchmarking performance.

Here’s what we covered:

  • Workspaces for managing multi-package projects efficiently
  • Feature flags for toggling optional functionality and dependencies
  • Custom build profiles for controlling optimization, debug info, and compile time
  • Build scripts (build.rs) for injecting compile-time info or compiling C code
  • Conditional compilation using cfg, cfg_attr, and feature-driven logic
  • Essential Cargo commands like cargo check, cargo tree, cargo expand, cargo bench, and cargo outdated

Together, these tools help you write leaner, faster, and more maintainable Rust code while keeping your build logic clean and your developer workflow efficient.

Whether you’re building a CLI utility, a high-performance server, or a modular codebase with shared logic, mastering Cargo’s advanced features gives you the confidence and control to scale.


We hope this deep dive into Cargo proves useful. Thank you so much for allowing ByteMagma to join you on your journey toward Rust programming mastery!
