    Optimizing Multi-Objective Problems with Desirability Functions

By Editor Times Featured · May 21, 2025 · 9 min read


    In data science, it's not unusual to come across problems with competing objectives. Whether designing products, tuning algorithms, or optimizing portfolios, we often need to balance several metrics to get the best possible outcome. Sometimes, maximizing one metric comes at the expense of another, making it hard to reach an overall optimized solution.

    While several approaches exist for solving multi-objective optimization problems, I found desirability functions to be both elegant and easy to explain to a non-technical audience, which makes them an interesting option to consider. Desirability functions combine several metrics into a standardized score, allowing for a holistic optimization.

    In this article, we'll explore:

    • The mathematical foundation of desirability functions
    • How to implement these functions in Python
    • How to optimize a multi-objective problem with desirability functions
    • Visualization for interpretation and explanation of the results

    To ground these concepts in a real example, we'll apply desirability functions to optimize bread baking: a toy problem with a few interconnected parameters and competing quality targets that will let us explore several optimization choices.

    By the end of this article, you'll have a powerful new tool in your data science toolkit for tackling multi-objective optimization problems across numerous domains, as well as fully functional code available here on GitHub.

    What are Desirability Functions?

    Desirability functions were first formalized by Harrington (1965) and later extended by Derringer and Suich (1980). The idea is to:

    • Transform each response into a performance score between 0 (absolutely unacceptable) and 1 (the ideal value)
    • Combine all scores into a single metric to maximize

    Let's explore the types of desirability functions, and then how we can combine all the scores.

    The different types of desirability functions

    There are three different desirability functions, which together can handle most situations.

    • Smaller-is-better: Used when minimizing a response is desirable
    def desirability_smaller_is_better(x: float, x_min: float, x_max: float) -> float:
        """Calculate desirability function value where smaller values are better.
    
        Args:
            x: Input parameter value
            x_min: Minimum acceptable value
            x_max: Maximum acceptable value
    
        Returns:
            Desirability score between 0 and 1
        """
        if x <= x_min:
            return 1.0
        elif x >= x_max:
            return 0.0
        else:
            return (x_max - x) / (x_max - x_min)
    • Larger-is-better: Used when maximizing a response is desirable
    def desirability_larger_is_better(x: float, x_min: float, x_max: float) -> float:
        """Calculate desirability function value where larger values are better.
    
        Args:
            x: Input parameter value
            x_min: Minimum acceptable value
            x_max: Maximum acceptable value
    
        Returns:
            Desirability score between 0 and 1
        """
        if x <= x_min:
            return 0.0
        elif x >= x_max:
            return 1.0
        else:
            return (x - x_min) / (x_max - x_min)
    • Target-is-best: Used when a specific target value is optimal
    def desirability_target_is_best(x: float, x_min: float, x_target: float, x_max: float) -> float:
        """Calculate two-sided desirability function value with a target value.
    
        Args:
            x: Input parameter value
            x_min: Minimum acceptable value
            x_target: Target (optimal) value
            x_max: Maximum acceptable value
    
        Returns:
            Desirability score between 0 and 1
        """
        if x_min <= x <= x_target:
            return (x - x_min) / (x_target - x_min)
        elif x_target < x <= x_max:
            return (x_max - x) / (x_max - x_target)
        else:
            return 0.0

    Each input parameter can be parameterized with one of these three desirability functions before combining them into a single desirability score.

    Combining Desirability Scores

    Once individual metrics are transformed into desirability scores, they need to be combined into an overall desirability. The most common approach is the weighted geometric mean:

    D = (d_1^w_1 × d_2^w_2 × … × d_n^w_n)^(1 / (w_1 + w_2 + … + w_n))

    where the d_i are the individual desirability values and the w_i are weights reflecting the relative importance of each metric.

    The geometric mean has an important property: if any single desirability is 0 (i.e. completely unacceptable), the overall desirability is also 0, regardless of the other values. This enforces that all requirements must be met to some extent.

    import numpy as np

    def overall_desirability(desirabilities, weights=None):
        """Compute overall desirability using a weighted geometric mean
        
        Parameters:
        -----------
        desirabilities : list
            Individual desirability scores
        weights : list
            Weights for each desirability
            
        Returns:
        --------
        float
            Overall desirability score
        """
        if weights is None:
            weights = [1] * len(desirabilities)
            
        # Convert to numpy arrays
        d = np.array(desirabilities)
        w = np.array(weights)
        
        # Calculate the weighted geometric mean
        return np.prod(d ** w) ** (1 / np.sum(w))
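    A short check makes the veto property concrete (the function is restated so the snippet runs standalone): one unacceptable metric nulls the overall score, while equal weights reduce to a plain geometric mean.

    ```python
    import numpy as np

    def overall_desirability(desirabilities, weights=None):
        # Weighted geometric mean of the individual scores
        if weights is None:
            weights = [1] * len(desirabilities)
        d = np.array(desirabilities)
        w = np.array(weights)
        return np.prod(d ** w) ** (1 / np.sum(w))

    print(overall_desirability([0.25, 1.0]))      # 0.5: geometric mean of 0.25 and 1.0
    print(overall_desirability([0.8, 0.0, 0.9]))  # 0.0: one unacceptable metric vetoes all
    ```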

    The weights are hyperparameters that give leverage on the final result and leave room for customization.

    A Practical Optimization Example: Bread Baking

    To demonstrate desirability functions in action, let's apply them to a toy problem: optimizing bread baking.

    The Parameters and Quality Metrics

    Let's play with the following parameters:

    1. Fermentation Time (30–180 minutes)
    2. Fermentation Temperature (20–30°C)
    3. Hydration Level (60–85%)
    4. Kneading Time (0–20 minutes)
    5. Baking Temperature (180–250°C)

    And let's try to optimize these metrics:

    1. Texture Quality: the texture of the bread
    2. Flavor Profile: the flavor of the bread
    3. Practicality: the practicality of the whole process

    Of course, each of these metrics depends on more than one parameter. So here comes one of the most crucial steps: mapping parameters to quality metrics.

    For each quality metric, we need to define how parameters influence it:

    from typing import List

    import numpy as np

    def compute_flavor_profile(params: List[float]) -> float:
        """Compute flavor profile score based on input parameters.
    
        Args:
            params: List of parameter values [fermentation_time, ferment_temp, hydration,
                   kneading_time, baking_temp]
    
        Returns:
            Weighted flavor profile score between 0 and 1
        """
        # Flavor mainly affected by fermentation parameters
        fermentation_d = desirability_larger_is_better(params[0], 30, 180)
        ferment_temp_d = desirability_target_is_best(params[1], 20, 24, 28)
        hydration_d = desirability_target_is_best(params[2], 65, 75, 85)
    
        # Baking temperature has minimal effect on flavor
        weights = [0.5, 0.3, 0.2]
        return np.average([fermentation_d, ferment_temp_d, hydration_d],
                          weights=weights)

    Here, for example, the flavor is influenced by the following:

    • The fermentation time, with minimal desirability below 30 minutes and maximal desirability above 180 minutes
    • The fermentation temperature, with desirability peaking at 24 degrees Celsius
    • The hydration, with desirability peaking at 75% humidity

    These computed desirabilities are then combined in a weighted average to return the flavor desirability. Similar computations are made for the texture quality and the practicality.
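    The actual texture and practicality mappings live in the linked GitHub code; purely as an illustration, a texture metric might be sketched as below, where the parameter sweet spots and weights are hypothetical assumptions, not the author's actual values:

    ```python
    import numpy as np

    def desirability_target_is_best(x, x_min, x_target, x_max):
        # Two-sided desirability: peaks at x_target, 0 outside [x_min, x_max]
        if x_min <= x <= x_target:
            return (x - x_min) / (x_target - x_min)
        elif x_target < x <= x_max:
            return (x_max - x) / (x_max - x_target)
        return 0.0

    def compute_texture_quality(params):
        """Hypothetical texture mapping: kneading time, hydration and baking
        temperature each have a sweet spot (illustrative values only)."""
        kneading_d = desirability_target_is_best(params[3], 0, 12, 20)
        hydration_d = desirability_target_is_best(params[2], 60, 70, 85)
        baking_d = desirability_target_is_best(params[4], 180, 230, 250)
        return np.average([kneading_d, hydration_d, baking_d], weights=[0.4, 0.3, 0.3])

    # Every parameter exactly at its assumed sweet spot gives a perfect score
    print(compute_texture_quality([120, 24, 70, 12, 230]))  # 1.0
    ```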

    The Objective Function

    Following the desirability function approach, we'll use the overall desirability as our objective function. The goal is to maximize this overall score, which means finding parameters that best satisfy all three requirements simultaneously:

    def objective_function(params: List[float], weights: List[float]) -> float:
        """Compute overall desirability score based on individual quality metrics.
    
        Args:
            params: List of parameter values
            weights: Weights for the texture, flavor and practicality scores
    
        Returns:
            Negative overall desirability score (for minimization)
        """
        # Compute individual desirability scores
        texture = compute_texture_quality(params)
        flavor = compute_flavor_profile(params)
        practicality = compute_practicality(params)
    
        # Ensure weights sum to one
        weights = np.array(weights) / np.sum(weights)
    
        # Calculate overall desirability using the geometric mean
        overall_d = overall_desirability([texture, flavor, practicality], weights)
    
        # Return the negative value since we want to maximize desirability
        # but optimization functions typically minimize
        return -overall_d

    After computing the individual desirabilities for texture, flavor and practicality, the overall desirability is simply computed with a weighted geometric mean. The function finally returns the negative overall desirability, so that it can be minimized.
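    The sign flip can be verified on a one-dimensional toy problem: minimizing the negated target-is-best desirability of the fermentation temperature should land near the 24 °C target. Note that Powell is used here rather than the SLSQP of the full example, since the tent shape has a kink at its optimum:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def d_target(x, x_min, x_target, x_max):
        # Two-sided desirability peaking at x_target
        if x_min <= x <= x_target:
            return (x - x_min) / (x_target - x_min)
        elif x_target < x <= x_max:
            return (x_max - x) / (x_max - x_target)
        return 0.0

    # Maximize the desirability by minimizing its negative
    res = minimize(lambda v: -d_target(v[0], 20, 24, 28),
                   x0=[21.0], bounds=[(20, 28)], method="Powell")
    print(res.x[0])  # close to the 24 degree target
    ```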

    Optimization with SciPy

    We finally use SciPy's minimize function to find the optimal parameters. Since the objective function returns the negative overall desirability, minimizing it maximizes the overall desirability:

    from scipy.optimize import minimize

    def optimize(weights: list[float]) -> list[float]:
        # Define parameter bounds
        bounds = {
            'fermentation_time': (30, 180),
            'fermentation_temp': (20, 30),
            'hydration_level': (60, 85),
            'kneading_time': (0, 20),
            'baking_temp': (180, 250)
        }
    
        # Initial guess (middle of bounds)
        x0 = [(b[0] + b[1]) / 2 for b in bounds.values()]
    
        # Run optimization
        result = minimize(
            objective_function,
            x0,
            args=(weights,),
            bounds=list(bounds.values()),
            method='SLSQP'
        )
    
        return result.x

    In this function, after defining the bounds for each parameter, the initial guess is computed as the middle of the bounds and then given as input to SciPy's minimize function. The result is finally returned.

    The weights are given as input to the optimizer too, and are a good way to customize the output. For example, with a larger weight on practicality, the optimized solution will favor practicality over flavor and texture.

    Let's now visualize the results for a few sets of weights.

    Visualization of Results

    Let's see how the optimizer handles different preference profiles, demonstrating the flexibility of desirability functions given various input weights.

    Let's take a look at the results when the weights favor practicality:

    Optimized parameters with weights favoring practicality. Image by author.

    With weights largely in favor of practicality, the achieved overall desirability is 0.69, with a short kneading time of 5 minutes, since a high value negatively impacts practicality.

    Now, if we optimize with an emphasis on texture, we get slightly different results:

    Optimized parameters with weights favoring texture. Image by author.

    In this case, the achieved overall desirability is 0.85, significantly higher. The kneading time is now 12 minutes, as a higher value positively impacts texture and is not penalized as much for practicality.

    Conclusion: Practical Applications of Desirability Functions

    While we focused on bread baking as our example, the same approach can be applied to various domains, such as product formulation in cosmetics or resource allocation in portfolio optimization.

    Desirability functions provide a powerful mathematical framework for tackling multi-objective optimization problems across numerous data science applications. By transforming raw metrics into standardized desirability scores, we can effectively combine and optimize disparate objectives.

    The key advantages of this approach include:

    • Standardized scales that make different metrics comparable and easy to combine into a single objective
    • Flexibility to handle different types of objectives: minimize, maximize, target
    • Clear communication of preferences through mathematical functions

    The code provided here offers a starting point for your own experimentation. Whether you're optimizing industrial processes, machine learning models, or product formulations, hopefully desirability functions offer a systematic approach to finding the best compromise among competing objectives.


