
Pipeline Generators

What is the Pipeline Generator module?

The Pipeline Generator module addresses the following scenario: you tell UrbanMapper exactly what urban analysis you want, and it crafts a pipeline for you, no coding required. Powered by Large Language Models (LLMs), this module transforms your natural language descriptions into executable Python code for UrbanMapper pipelines.

We also recommend looking through the Pipeline Generator example for a more hands-on introduction to the module and its usage.
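
For a quick taste of the workflow documented on this page, here is a minimal sketch: pick an LLM by its short name, describe the analysis, and get back pipeline code as a string. The description below is purely illustrative, and an OpenAI API key is assumed to be configured (for instance through the OPENAI_API_KEY environment variable), since the bundled generators are OpenAI-based.

>>> from urban_mapper import UrbanMapper
>>> mapper = UrbanMapper()
>>> pipeline_code = mapper.pipeline_generator.with_LLM("gpt-4")\
...     .generate_urban_pipeline(
...         "Load taxi trips in Manhattan and show the count of pickups per street segment."
...     )
>>> print(pipeline_code)  # Review the suggested pipeline before running anything.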

Documentation Under Alpha Construction

This documentation is in its early stages and still being developed. The API may therefore change, and some parts might be incomplete or inaccurate.

Use at your own risk, and please report anything you find that seems incorrect or outdated.

Open An Issue!

PipelineGeneratorBase

Bases: ABC

Abstract base class for pipeline generators.

This class defines the interface for pipeline generators.

What is a pipeline generator's primitive

Pipeline generators use large language models (LLMs) to automatically create UrbanMapper pipelines from natural language descriptions.

Implementations of this class must provide a generate_urban_pipeline method that takes a user description and returns Python code for an UrbanMapper pipeline.

Use of Short Name

The short name of the generator is used to identify the generator in the PipelineGeneratorFactory. It should be unique among all generators.

For instance, it is much easier to call GPT4 than GPT4Generator. See the factory for further details.

Attributes:

Name Type Description
instructions

The instructions to guide the LLM in generating pipelines.

Examples:

>>> class GPT4Generator(PipelineGeneratorBase):
...     short_name = "GPT4"
...
...     def __init__(self, instructions: str):
...         self.instructions = instructions
...
...     def generate_urban_pipeline(self, user_description: str) -> str:
...         # Implementation that uses GPT-4 to generate a pipeline
...         ...
Source code in src/urban_mapper/modules/pipeline_generator/abc_pipeline_generator.py
class PipelineGeneratorBase(ABC):
    """Abstract base class for pipeline generators.

    This class defines the interface for pipeline generators.

    !!! question "What is a pipeline geneartor's primitive"
        Pipeline generators use large language models (LLMs) to automatically
        create `UrbanMapper pipelines` from `natural language descriptions`.

    Implementations of this class must provide a `generate_urban_pipeline` method
    that takes a user description and returns Python code for an `UrbanMapper pipeline`.

    !!! note "Use of Short Name"
        The short name of the generator is used to identify the generator in the
        `PipelineGeneratorFactory`. It should be unique among all generators.

        For instance, it is much easier to call GPT4 than `GPT4Generator`. See the factory for further details.

    Attributes:
        instructions: The instructions to guide the LLM in generating pipelines.

    Examples:
        >>> class GPT4Generator(PipelineGeneratorBase):
        ...     short_name = "GPT4"
        ...
        ...     def __init__(self, instructions: str):
        ...         self.instructions = instructions
        ...
        ...     def generate_urban_pipeline(self, user_description: str) -> str:
        ...         # Implementation that uses GPT-4 to generate a pipeline
        ...         ...
    """

    @abstractmethod
    def generate_urban_pipeline(self, user_description: str) -> str:
        """Generate an `UrbanMapper pipeline` from a `natural language description`.

        This method uses a `large language model` to generate `Python code` for an
        `UrbanMapper pipeline` based on the user's `natural language description`.
        The generated code can then be executed to create and run the pipeline.

        Args:
            user_description: A natural language description of the desired pipeline,
                such as "Load traffic data for New York and visualise accident hotspots."

        Returns:
            A string containing Python code that implements the described pipeline.
            This code can be executed with exec() to run the pipeline.

        Examples:
            >>> generator = SomeGenerator(instructions)
            >>> pipeline_code = generator.generate_urban_pipeline(
            ...     "Load taxi trip data for Manhattan and create a heatmap of pickups"
            ... )
            >>> print(pipeline_code) # You may use IPython.display's Code(...) for syntax highlighting, or even exec(pipeline_code) to run it, though this is not recommended.
        """
        pass

generate_urban_pipeline(user_description) abstractmethod

Generate an UrbanMapper pipeline from a natural language description.

This method uses a large language model to generate Python code for an UrbanMapper pipeline based on the user's natural language description. The generated code can then be executed to create and run the pipeline.

Parameters:

Name Type Description Default
user_description str

A natural language description of the desired pipeline, such as "Load traffic data for New York and visualise accident hotspots."

required

Returns:

Type Description
str

A string containing Python code that implements the described pipeline. This code can be executed with exec() to run the pipeline.

Examples:

>>> generator = SomeGenerator(instructions)
>>> pipeline_code = generator.generate_urban_pipeline(
...     "Load taxi trip data for Manhattan and create a heatmap of pickups"
... )
>>> print(pipeline_code) # You may use IPython.display's Code(...) for syntax highlighting, or even exec(pipeline_code) to run it, though this is not recommended.
Source code in src/urban_mapper/modules/pipeline_generator/abc_pipeline_generator.py
@abstractmethod
def generate_urban_pipeline(self, user_description: str) -> str:
    """Generate an `UrbanMapper pipeline` from a `natural language description`.

    This method uses a `large language model` to generate `Python code` for an
    `UrbanMapper pipeline` based on the user's `natural language description`.
    The generated code can then be executed to create and run the pipeline.

    Args:
        user_description: A natural language description of the desired pipeline,
            such as "Load traffic data for New York and visualise accident hotspots."

    Returns:
        A string containing Python code that implements the described pipeline.
        This code can be executed with exec() to run the pipeline.

    Examples:
        >>> generator = SomeGenerator(instructions)
        >>> pipeline_code = generator.generate_urban_pipeline(
        ...     "Load taxi trip data for Manhattan and create a heatmap of pickups"
        ... )
        >>> print(pipeline_code) # You may use IPython.display's Code(...) for syntax highlighting, or even exec(pipeline_code) to run it, though this is not recommended.
    """
    pass

GPT4PipelineGenerator

Bases: PipelineGeneratorBase

Generates UrbanMapper pipelines using GPT-4.

This class uses the GPT-4 language model via the ell library to generate Python code for UrbanMapper pipelines based on user-provided instructions and descriptions.

What is ell

ell is a lightweight, functional prompt engineering framework built on a few core principles:

  • Prompts are programs, not strings.
  • Prompts are actually parameters of a machine learning model.
  • Tools for monitoring, versioning, and visualization
  • Multimodality should be first class
  • ...and much more!

See more in the ell GitHub repository.

Short Name

To use this primitive, make sure to pass gpt-4 as the short name when calling with_LLM(.).
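
For illustration, a minimal sketch of instantiating this generator directly rather than through the factory. The import path is assumed to mirror the source file shown below, the instructions string is a hypothetical stand-in for the default instructions file, and a valid OpenAI API key is required since generate_urban_pipeline is guarded by check_openai_api_key.

>>> # Assumed import path, mirroring src/urban_mapper/modules/pipeline_generator/generators/gpt4_pipeline_generator.py
>>> from urban_mapper.modules.pipeline_generator.generators.gpt4_pipeline_generator import (
...     GPT4PipelineGenerator,
... )
>>> generator = GPT4PipelineGenerator(
...     instructions="Return only runnable Python code that builds an UrbanPipeline."  # hypothetical instructions
... )
>>> code = generator.generate_urban_pipeline(
...     "Map tree coverage per neighbourhood in Brooklyn."
... )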

Source code in src/urban_mapper/modules/pipeline_generator/generators/gpt4_pipeline_generator.py
class GPT4PipelineGenerator(PipelineGeneratorBase):
    """Generates `UrbanMapper pipelines` using GPT-4.

    This class uses the GPT-4 language model via the `ell` library to generate
    Python code for `UrbanMapper pipelines` based on user-provided instructions and descriptions.

    !!! question "What is `ell`"
        `ell` is a lightweight, functional prompt engineering framework built on a few core principles:

        - [x] Prompts are programs, not strings.
        - [x] Prompts are actually parameters of a machine learning model.
        - [x] Tools for monitoring, versioning, and visualization
        - [x] Multimodality should be first class
        - [x] ...and much more!

        See more in the [ell GitHub repository](https://github.com/MadcowD/ell).

    !!! note "Short Name"
        To use this primitive, when calling `with_LLM(.)` make sure to write `gpt-4` as the short name.
    """

    short_name = "gpt-4"

    def __init__(self, instructions: str):
        self.instructions = instructions

    @check_openai_api_key
    def generate_urban_pipeline(self, user_description: str) -> str:
        @ell.simple(model="gpt-4")
        def generate_code():
            return f"{self.instructions}\n\nUser Description: {user_description}\n\nGenerate the Python code for the UrbanPipeline:"

        return generate_code()

generate_urban_pipeline(user_description)

Source code in src/urban_mapper/modules/pipeline_generator/generators/gpt4_pipeline_generator.py
@check_openai_api_key
def generate_urban_pipeline(self, user_description: str) -> str:
    @ell.simple(model="gpt-4")
    def generate_code():
        return f"{self.instructions}\n\nUser Description: {user_description}\n\nGenerate the Python code for the UrbanPipeline:"

    return generate_code()

GPT4OPipelineGenerator

Bases: PipelineGeneratorBase

Generates UrbanMapper pipelines using GPT-4o.

This class uses the GPT-4o language model via the ell library to generate Python code for UrbanMapper pipelines based on user-provided instructions and descriptions.

What is ell

ell is a lightweight, functional prompt engineering framework built on a few core principles:

  • Prompts are programs, not strings.
  • Prompts are actually parameters of a machine learning model.
  • Tools for monitoring, versioning, and visualization
  • Multimodality should be first class
  • ...and much more!

See more in the ell GitHub repository.

Short Name

To use this primitive, make sure to pass gpt-4o as the short name when calling with_LLM(.).

Source code in src/urban_mapper/modules/pipeline_generator/generators/gpt4o_pipeline_generator.py
class GPT4OPipelineGenerator(PipelineGeneratorBase):
    """Generates `UrbanMapper pipelines` using GPT-4o.

    This class uses the GPT-4o language model via the `ell` library to generate
    Python code for `UrbanMapper pipelines` based on user-provided instructions and descriptions.

    !!! question "What is `ell`"
        `ell` is a lightweight, functional prompt engineering framework built on a few core principles:

        - [x] Prompts are programs, not strings.
        - [x] Prompts are actually parameters of a machine learning model.
        - [x] Tools for monitoring, versioning, and visualization
        - [x] Multimodality should be first class
        - [x] ...and much more!

        See more in the [ell GitHub repository](https://github.com/MadcowD/ell).

    !!! note "Short Name"
        To use this primitive, when calling `with_LLM(.)` make sure to write `gpt-4o` as the short name.
    """

    short_name = "gpt-4o"

    def __init__(self, instructions: str):
        self.instructions = instructions

    @check_openai_api_key
    def generate_urban_pipeline(self, user_description: str) -> str:
        @ell.simple(model="gpt-4o")
        def generate_code():
            return f"{self.instructions}\n\nUser Description: {user_description}\n\nGenerate the Python code for the UrbanPipeline:"

        return generate_code()

generate_urban_pipeline(user_description)

Source code in src/urban_mapper/modules/pipeline_generator/generators/gpt4o_pipeline_generator.py
@check_openai_api_key
def generate_urban_pipeline(self, user_description: str) -> str:
    @ell.simple(model="gpt-4o")
    def generate_code():
        return f"{self.instructions}\n\nUser Description: {user_description}\n\nGenerate the Python code for the UrbanPipeline:"

    return generate_code()

GPT35TurboPipelineGenerator

Bases: PipelineGeneratorBase

Generates UrbanMapper pipelines using GPT-3.5-turbo.

This class uses the GPT-3.5-turbo language model via the ell library to generate Python code for UrbanMapper pipelines based on user-provided instructions and descriptions.

What is ell

ell is a lightweight, functional prompt engineering framework built on a few core principles:

  • Prompts are programs, not strings.
  • Prompts are actually parameters of a machine learning model.
  • Tools for monitoring, versioning, and visualization
  • Multimodality should be first class
  • ...and much more!

See more in the ell GitHub repository.

Short Name

To use this primitive, make sure to pass gpt-3.5-turbo as the short name when calling with_LLM(.).

Source code in src/urban_mapper/modules/pipeline_generator/generators/gpt35turbo_pipeline_generator.py
class GPT35TurboPipelineGenerator(PipelineGeneratorBase):
    """Generates `UrbanMapper pipelines` using GPT-3.5-turbo.

    This class uses the GPT-3.5-turbo language model via the `ell` library to generate
    Python code for `UrbanMapper pipelines` based on user-provided instructions and descriptions.

    !!! question "What is `ell`"
        `ell` is a lightweight, functional prompt engineering framework built on a few core principles:

        - [x] Prompts are programs, not strings.
        - [x] Prompts are actually parameters of a machine learning model.
        - [x] Tools for monitoring, versioning, and visualization
        - [x] Multimodality should be first class
        - [x] ...and much more!

        See more in the [ell GitHub repository](https://github.com/MadcowD/ell).

    !!! note "Short Name"
        To use this primitive, when calling `with_LLM(.)` make sure to write `gpt-3.5-turbo` as the short name.
    """

    short_name = "gpt-3.5-turbo"

    def __init__(self, instructions: str):
        self.instructions = instructions

    @check_openai_api_key
    def generate_urban_pipeline(self, user_description: str) -> str:
        @ell.simple(model="gpt-3.5-turbo")
        def generate_code():
            return f"{self.instructions}\n\nUser Description: {user_description}\n\nGenerate the Python code for the UrbanPipeline:"

        return generate_code()

generate_urban_pipeline(user_description)

Source code in src/urban_mapper/modules/pipeline_generator/generators/gpt35turbo_pipeline_generator.py
@check_openai_api_key
def generate_urban_pipeline(self, user_description: str) -> str:
    @ell.simple(model="gpt-3.5-turbo")
    def generate_code():
        return f"{self.instructions}\n\nUser Description: {user_description}\n\nGenerate the Python code for the UrbanPipeline:"

    return generate_code()

PipelineGeneratorFactory

Factory class for creating and configuring pipeline generators.

This class implements a fluent, method-chaining interface for creating and configuring pipeline generators. Pipeline generators use Large Language Models (LLMs) to automatically create UrbanMapper pipelines from natural language descriptions.

The factory manages the details of generator instantiation, configuration, and execution, providing a consistent interface regardless of the underlying LLM implementation.

Attributes:

Name Type Description
_type

The type of LLM-based generator to create.

_custom_instructions

Optional custom instructions to guide the LLM.

Examples:

>>> from urban_mapper import UrbanMapper
>>> mapper = UrbanMapper()
>>> pipeline_code = mapper.pipeline_generator.with_LLM("gpt-4")\
...     .generate_urban_pipeline(
...         "Load taxi trips in Manhattan and show the count of pickups per street segments (roads)."
...     )
Source code in src/urban_mapper/modules/pipeline_generator/pipeline_generator_factory.py
class PipelineGeneratorFactory:
    """Factory class for creating and configuring pipeline generators.

    This class implements a fluent chaining-methods-based interface for creating and configuring pipeline
    generators. `Pipeline generators` use `Large Language Models (LLMs)` to automatically create `UrbanMapper pipelines`
    from natural language descriptions.

    The factory manages the details of `generator instantiation`, `configuration`, and
    `execution`, providing a consistent interface regardless of the underlying LLM implementation.

    Attributes:
        _type: The type of LLM-based generator to create.
        _custom_instructions: Optional custom instructions to guide the LLM.

    Examples:
        >>> from urban_mapper import UrbanMapper
        >>> mapper = UrbanMapper()
        >>> pipeline_code = mapper.pipeline_generator.with_LLM("gpt-4")\
        ...     .generate_urban_pipeline(
        ...         "Load taxi trips in Manhattan and show the count of pickups per street segments (roads)."
        ...     )
    """

    def __init__(self):
        self._type = None
        self._custom_instructions = None

    def with_LLM(self, primitive_type: str) -> "PipelineGeneratorFactory":
        """Specify the LLM to use for pipeline generation.

        !!! note "See This As `with.type(.)`"
            In mostly all other modules we use `with.type(.)` to specify the type of
            primitive given the module we are trying to use. Here we name it `with_LLM`
            for the sake of clarity and given the very little scope the current module has.

        This method sets the type of LLM to use for generating pipelines. Available
        types are registered in the `PIPELINE_GENERATOR_REGISTRY` or simply by perusing the folder
        `generators` in `src/urban_mapper/modules/pipeline_generator/`, they are all
        subclasses of `PipelineGeneratorBase`. The short name of the generator is used
        to identify the generator in the factory. It should be unique among all generators.

        !!! warning "Naming Mistakes"
            If you make a mistake in the name of the generator, the factory will
            provide a probable suggestion based on the available names. This is done using
            the `fuzzywuzzy` library, which uses Levenshtein distance to find the closest match.

            Pretty cool addition to `UrbanMapper`!

        Args:
            primitive_type: The short name of the LLM type to use, such as "gpt-4" or "gpt-3.5-turbo".

        Returns:
            The PipelineGeneratorFactory instance for method chaining.

        Raises:
            ValueError: If the specified LLM type is not found in the registry.

        Examples:
            >>> generator = mapper.pipeline_generator.with_LLM("gpt-4")
        """
        if primitive_type not in PIPELINE_GENERATOR_REGISTRY:
            available = list(PIPELINE_GENERATOR_REGISTRY.keys())
            match, score = process.extractOne(primitive_type, available)
            suggestion = f" Maybe you meant '{match}'?" if score > 80 else ""
            raise ValueError(
                f"Unknown generator type '{primitive_type}'. Available: {', '.join(available)}.{suggestion}"
            )
        self._type = primitive_type
        return self

    def with_custom_instructions(self, instructions: str) -> "PipelineGeneratorFactory":
        """Set custom instructions for guiding the LLM.

        This method provides custom instructions to guide the LLM in generating
        pipelines. This can be used to override the default instructions or to
        provide additional context or constraints.

        !!! question "What is the default instructions file?"
            The default instructions file is located in: `src/urban_mapper/modules/pipeline_generator/instructions.txt`.

        Args:
            instructions: The custom instructions to provide to the LLM.

        Returns:
            The PipelineGeneratorFactory instance for method chaining.

        Examples:
            >>> instructions = "Generate a pipeline that focuses on urban mobility..."
            >>> generator = mapper.pipeline_generator.with_custom_instructions(instructions)

            >>> # Using a file reading
            >>> with open("path/to/custom_instructions.txt", "r") as file:
            ...     instructions = file.read()
            >>> generator = mapper.pipeline_generator.with_custom_instructions(instructions)
        """
        self._custom_instructions = instructions
        return self

    def _build(self) -> PipelineGeneratorBase:
        """Build and return the configured pipeline generator instance.

        This method finalises the configuration and instantiates the appropriate
        pipeline generator implementation. It's an internal method used by
        the generate_urban_pipeline method.

        Returns:
            An instance of a class derived from PipelineGeneratorBase, configured
            according to the settings specified through the factory methods.

        Raises:
            ValueError: If the LLM type is not specified or is invalid.
        """
        if self._type not in PIPELINE_GENERATOR_REGISTRY:
            raise ValueError(f"Unknown generator type: {self._type}")
        instructions = (
            self._custom_instructions or open(DEFAULT_INSTRUCTIONS_FILE, "r").read()
        )
        generator_class = PIPELINE_GENERATOR_REGISTRY[self._type]
        return generator_class(instructions)

    def generate_urban_pipeline(self, user_description: str) -> str:
        """Generate an `UrbanMapper pipeline` suggestion from a `natural language description`.

        This method uses the configured LLM to generate Python code for an
        `UrbanMapper pipeline` based on the user's `natural language description`.
        The generated code can then be executed to create and run the pipeline.

        Args:
            user_description: A natural language description of the desired pipeline,
                such as "Load traffic data for New York and visualise accident hotspots."

        Returns:
            A string containing Python code that implements the described pipeline.
            This code can be executed with exec() to run the pipeline.

        Examples:
            >>> pipeline_code = PipelineGeneratorFactory().with_LLM("gpt-4")\
            ...     .generate_urban_pipeline(
            ...         "Load taxi trip data for Manhattan and create a heatmap of pickups"
            ...     )
        """
        generator = self._build()
        return generator.generate_urban_pipeline(user_description)

with_LLM(primitive_type)

Specify the LLM to use for pipeline generation.

See This As with.type(.)

In almost all other modules we use with.type(.) to specify the type of primitive for the module in use. Here we name it with_LLM for the sake of clarity, given the narrow scope of the current module.

This method sets the type of LLM to use for generating pipelines. Available types are registered in the PIPELINE_GENERATOR_REGISTRY; you can also find them by browsing the generators folder in src/urban_mapper/modules/pipeline_generator/, where each is a subclass of PipelineGeneratorBase. The short name of the generator is used to identify it in the factory and should be unique among all generators.

Naming Mistakes

If you make a mistake in the name of the generator, the factory will provide a probable suggestion based on the available names. This is done using the fuzzywuzzy library, which uses Levenshtein distance to find the closest match.

Pretty cool addition to UrbanMapper!
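
For instance, a hedged sketch of what a near-miss looks like; the exact wording and the registry contents depend on your installation:

>>> try:
...     mapper.pipeline_generator.with_LLM("gpt4")  # missing the hyphen
... except ValueError as error:
...     print(error)  # e.g. "Unknown generator type 'gpt4'. Available: ... Maybe you meant 'gpt-4'?"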

Parameters:

Name Type Description Default
primitive_type str

The short name of the LLM type to use, such as "gpt-4" or "gpt-3.5-turbo".

required

Returns:

Type Description
PipelineGeneratorFactory

The PipelineGeneratorFactory instance for method chaining.

Raises:

Type Description
ValueError

If the specified LLM type is not found in the registry.

Examples:

>>> generator = mapper.pipeline_generator.with_LLM("gpt-4")
Source code in src/urban_mapper/modules/pipeline_generator/pipeline_generator_factory.py
def with_LLM(self, primitive_type: str) -> "PipelineGeneratorFactory":
    """Specify the LLM to use for pipeline generation.

    !!! note "See This As `with.type(.)`"
        In mostly all other modules we use `with.type(.)` to specify the type of
        primitive given the module we are trying to use. Here we name it `with_LLM`
        for the sake of clarity and given the very little scope the current module has.

    This method sets the type of LLM to use for generating pipelines. Available
    types are registered in the `PIPELINE_GENERATOR_REGISTRY` or simply by perusing the folder
    `generators` in `src/urban_mapper/modules/pipeline_generator/`, they are all
    subclasses of `PipelineGeneratorBase`. The short name of the generator is used
    to identify the generator in the factory. It should be unique among all generators.

    !!! warning "Naming Mistakes"
        If you make a mistake in the name of the generator, the factory will
        provide a probable suggestion based on the available names. This is done using
        the `fuzzywuzzy` library, which uses Levenshtein distance to find the closest match.

        Pretty cool addition to `UrbanMapper`!

    Args:
        primitive_type: The short name of the LLM type to use, such as "gpt-4" or "gpt-3.5-turbo".

    Returns:
        The PipelineGeneratorFactory instance for method chaining.

    Raises:
        ValueError: If the specified LLM type is not found in the registry.

    Examples:
        >>> generator = mapper.pipeline_generator.with_LLM("gpt-4")
    """
    if primitive_type not in PIPELINE_GENERATOR_REGISTRY:
        available = list(PIPELINE_GENERATOR_REGISTRY.keys())
        match, score = process.extractOne(primitive_type, available)
        suggestion = f" Maybe you meant '{match}'?" if score > 80 else ""
        raise ValueError(
            f"Unknown generator type '{primitive_type}'. Available: {', '.join(available)}.{suggestion}"
        )
    self._type = primitive_type
    return self

with_custom_instructions(instructions)

Set custom instructions for guiding the LLM.

This method provides custom instructions to guide the LLM in generating pipelines. This can be used to override the default instructions or to provide additional context or constraints.

What is the default instructions file?

The default instructions file is located in: src/urban_mapper/modules/pipeline_generator/instructions.txt.

Parameters:

Name Type Description Default
instructions str

The custom instructions to provide to the LLM.

required

Returns:

Type Description
PipelineGeneratorFactory

The PipelineGeneratorFactory instance for method chaining.

Examples:

>>> instructions = "Generate a pipeline that focuses on urban mobility..."
>>> generator = mapper.pipeline_generator.with_custom_instructions(instructions)
>>> # Using a file reading
>>> with open("path/to/custom_instructions.txt", "r") as file:
...     instructions = file.read()
>>> generator = mapper.pipeline_generator.with_custom_instructions(instructions)
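
The two factory methods chain naturally; a short sketch, where the instruction text and description are illustrative only:

>>> pipeline_code = (
...     mapper.pipeline_generator
...     .with_LLM("gpt-4o")
...     .with_custom_instructions(
...         "Prefer OSMnx street networks and keep the pipeline under ten steps."  # hypothetical guidance
...     )
...     .generate_urban_pipeline(
...         "Count bicycle-related collisions per intersection in Queens."
...     )
... )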
Source code in src/urban_mapper/modules/pipeline_generator/pipeline_generator_factory.py
def with_custom_instructions(self, instructions: str) -> "PipelineGeneratorFactory":
    """Set custom instructions for guiding the LLM.

    This method provides custom instructions to guide the LLM in generating
    pipelines. This can be used to override the default instructions or to
    provide additional context or constraints.

    !!! question "What is the default instructions file?"
        The default instructions file is located in: `src/urban_mapper/modules/pipeline_generator/instructions.txt`.

    Args:
        instructions: The custom instructions to provide to the LLM.

    Returns:
        The PipelineGeneratorFactory instance for method chaining.

    Examples:
        >>> instructions = "Generate a pipeline that focuses on urban mobility..."
        >>> generator = mapper.pipeline_generator.with_custom_instructions(instructions)

        >>> # Using a file reading
        >>> with open("path/to/custom_instructions.txt", "r") as file:
        ...     instructions = file.read()
        >>> generator = mapper.pipeline_generator.with_custom_instructions(instructions)
    """
    self._custom_instructions = instructions
    return self

generate_urban_pipeline(user_description)

Generate an UrbanMapper pipeline suggestion from a natural language description.

This method uses the configured LLM to generate Python code for an UrbanMapper pipeline based on the user's natural language description. The generated code can then be executed to create and run the pipeline.

Parameters:

Name Type Description Default
user_description str

A natural language description of the desired pipeline, such as "Load traffic data for New York and visualise accident hotspots."

required

Returns:

Type Description
str

A string containing Python code that implements the described pipeline. This code can be executed with exec() to run the pipeline.

Examples:

>>> pipeline_code = PipelineGeneratorFactory().with_LLM("gpt-4")\
...     .generate_urban_pipeline(
...         "Load taxi trip data for Manhattan and create a heatmap of pickups"
...     )
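
To inspect or run the returned code, one hedged option is to display it with IPython's Code helper for syntax highlighting and only call exec() after reviewing it, since exec runs arbitrary generated code:

>>> from IPython.display import Code
>>> Code(pipeline_code, language="python")  # syntax-highlighted review in a notebook
>>> # exec(pipeline_code)  # only after review; running LLM-generated code blindly is not recommended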
Source code in src/urban_mapper/modules/pipeline_generator/pipeline_generator_factory.py
def generate_urban_pipeline(self, user_description: str) -> str:
    """Generate an `UrbanMapper pipeline` suggestion from a `natural language description`.

    This method uses the configured LLM to generate Python code for an
    `UrbanMapper pipeline` based on the user's `natural language description`.
    The generated code can then be executed to create and run the pipeline.

    Args:
        user_description: A natural language description of the desired pipeline,
            such as "Load traffic data for New York and visualise accident hotspots."

    Returns:
        A string containing Python code that implements the described pipeline.
        This code can be executed with exec() to run the pipeline.

    Examples:
        >>> pipeline_code = PipelineGeneratorFactory().with_LLM("gpt-4")\
        ...     .generate_urban_pipeline(
        ...         "Load taxi trip data for Manhattan and create a heatmap of pickups"
        ...     )
    """
    generator = self._build()
    return generator.generate_urban_pipeline(user_description)