Ash.TypedStruct (ash v3.5.33)

A DSL for defining typed structs with field validation and constraints.

Ash.TypedStruct provides a convenient way to define a struct type in Ash.

Under the hood, it creates an Ash.Type.NewType with subtype_of: :struct and the appropriate constraints.

Example

defmodule MyApp.UserProfile do
  use Ash.TypedStruct 

  typed_struct do
    field :username, :string, allow_nil?: false
    field :email, :string, constraints: [match: ~r/@/]
    field :age, :integer, constraints: [min: 0, max: 150]
    field :bio, :string, default: ""
    field :verified, :boolean, default: false
  end
end

# Creating instances
{:ok, profile} = MyApp.UserProfile.new(username: "john", email: "john@example.com")

# Using new! for raising on errors
profile = MyApp.UserProfile.new!(username: "jane", email: "jane@example.com", age: 25)
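
# Invalid values return an error tuple
# (a sketch: the email field above is constrained to match ~r/@/)
{:error, _error} = MyApp.UserProfile.new(username: "sam", email: "not-an-email")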

# Can be used as an Ash type
defmodule MyApp.User do
  use Ash.Resource

  attributes do
    attribute :profile, MyApp.UserProfile
  end
end
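
As noted above, Ash.TypedStruct builds on Ash.Type.NewType with subtype_of: :struct. A rough hand-written equivalent of MyApp.UserProfile might look like the sketch below; the module Ash.TypedStruct actually generates may differ in its details.

defmodule MyApp.UserProfile do
  # The struct itself, with plain defaults for bio and verified
  defstruct [:username, :email, :age, bio: "", verified: false]

  use Ash.Type.NewType,
    subtype_of: :struct,
    constraints: [
      instance_of: __MODULE__,
      fields: [
        username: [type: :string, allow_nil?: false],
        email: [type: :string, constraints: [match: ~r/@/]],
        age: [type: :integer, constraints: [min: 0, max: 150]],
        bio: [type: :string],
        verified: [type: :boolean]
      ]
    ]
end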

Field Options

  • :type - The Ash type of the field (required)
  • :default - Default value for the field
  • :allow_nil? - Whether the field can be nil (defaults to true)
  • :constraints - Type-specific constraints (e.g., :min, :max, :match)
  • :description - Field documentation
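
For example, several of these options can be combined on a single field inside the typed_struct block (the field shown here is illustrative):

field :display_name, :string,
  allow_nil?: false,
  default: "Anonymous",
  description: "Name shown in the UI"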

Constructor Functions

The generated module includes:

  • new/1 - Returns {:ok, struct} or {:error, error}
  • new!/1 - Returns the struct or raises an error
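
A caller can handle both outcomes of new/1 with a case expression (a sketch reusing the MyApp.UserProfile definition above; since username is declared with allow_nil?: false, omitting it should return an error):

case MyApp.UserProfile.new(email: "jane@example.com") do
  {:ok, profile} -> profile
  {:error, error} -> IO.inspect(error, label: "invalid profile")
end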

Overriding new/1

You can override the new/1 function to add custom logic:

defmodule MyApp.CustomStruct do
  use Ash.TypedStruct

  typed_struct do
    field :name, :string, allow_nil?: false
    field :created_at, :utc_datetime
  end

  def new(params) do
    params = Map.put_new(params, :created_at, DateTime.utc_now())
    super(params)
  end
end
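
A quick usage sketch: because the override above uses Map.put_new/3, the params are passed as a map here.

{:ok, custom} = MyApp.CustomStruct.new(%{name: "example"})
# created_at was filled in by the override before validation
custom.created_at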

Options

  • :extensions (list of modules that adopt Spark.Dsl.Extension) - A list of DSL extensions to add to the Spark.Dsl

  • :otp_app (atom/0) - The OTP app to use for any application-configurable options

  • :fragments (list of module/0) - Fragments to include in the Spark.Dsl. See the fragments guide for more.
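
These options are given to use Ash.TypedStruct itself. A sketch (the extension module named here is a hypothetical placeholder):

defmodule MyApp.Preferences do
  use Ash.TypedStruct,
    otp_app: :my_app,
    extensions: [MyApp.Dsl.Audit] # hypothetical extension module

  typed_struct do
    field :theme, :string, default: "light"
  end
end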