Congratulations to the winners, and thank you for your active participation! We also thank SOYOTEC Technologies Corp. for their technical support.
The results of all participants have been verified by the organizers. The code will be uploaded to the website once the corresponding participant authorizes it. Since some of the articles describing these algorithms have not yet been published, the code is temporarily withheld to protect the algorithms; it will be released once the articles have been accepted for publication. If you are interested in these algorithms, you can contact the authors directly (Guangyuan Sui: 1173864595@qq.com, Gejie Rang: 21gjrang@stu.edu.cn, Yulong Ye: liangzwh@gmail.com).
In the past decade, dynamic constrained multiobjective optimization has attracted increasing research interest [1][2]. The problem is widespread in real-world applications, such as scheduling optimization and resource allocation, which involve time-varying objectives and constraints [3]-[5]. In particular, the corresponding dynamic constrained multiobjective optimization problems (DCMOPs) exhibit more complex characteristics and pose greater difficulties than purely dynamic multiobjective or constrained multiobjective problems [6]-[9]. Traditional multiobjective evolutionary algorithms face three main difficulties in solving this kind of problem. First, environmental changes can take many forms of dynamics, posing different levels of difficulty to algorithms; no single change-response strategy can handle all kinds of dynamics. Second, different types of constraints may appear under dynamic environments, which challenges any static optimizer to achieve good versatility in constraint handling. Finally, the response time available after an environmental change is generally tight. Given the above analysis, there is a significant need for new mechanisms for solving DCMOPs. In particular, a set of diverse and unbiased test problems is in great demand for systematically studying dynamic constrained multiobjective evolutionary algorithms (DCMOEAs) in the field [10][11]. A generic DCMOP formulation is sketched below.
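For reference, a generic DCMOP can be written as follows. This is a sketch using the formulation common in the dynamic multiobjective literature, where τ is the generation counter and τt and nt are the frequency and severity of change given in the settings below; the competition report gives the authoritative definitions.

```latex
\begin{aligned}
\min_{\mathbf{x} \in \Omega}\ & \mathbf{F}(\mathbf{x}, t)
    = \bigl(f_1(\mathbf{x}, t), \ldots, f_m(\mathbf{x}, t)\bigr)^{\mathrm{T}} \\
\text{s.t.}\ & g_i(\mathbf{x}, t) \le 0, \quad i = 1, \ldots, p, \\
& t = \frac{1}{n_t} \left\lfloor \frac{\tau}{\tau_t} \right\rfloor,
\end{aligned}
```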
To promote research on dynamic constrained multiobjective optimization (DCMO), 10 benchmark functions are developed, covering diverse characteristics that represent different real-world scenarios, e.g., continuity-disconnection, time-dependent PF/PS geometries, dynamic infeasible regions, and small feasible regions. The detailed definitions of these 10 test problems can be found in Benchmark Problems for CEC2023 Competition on Dynamic Constrained Multiobjective Optimization.
Based on this test suite with its varied characteristics, researchers can better understand the strengths and weaknesses of DCMOEAs, stimulating research on dynamic constrained multiobjective optimization [12][13]. All the benchmark functions have been implemented in MATLAB based on the code provided by [14], and can be downloaded from the following website:
https://github.com/gychen94/DCMO
1) General settings:
Population size: 100.
Number of variables: 10.
Frequency of change (τt): 10 (fast-changing environments), 30 (slow-changing environments).
Severity of change (nt): 5 (severely changing environments), 10 (moderately changing environments).
Number of changes: 60.
Stopping criterion: a maximum of 100 × (60τt + 60) fitness evaluations, where 6000 fitness evaluations are given before the first environmental change occurs (see the sanity check after this list).
Number of independent runs: 20.
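For reference, the evaluation budget implied by these settings can be checked as follows. This is a minimal MATLAB sketch; the variable names are ours and do not come from the official code.

```matlab
% Sanity check of the evaluation budget implied by the general settings.
popSize  = 100;      % population size
nChanges = 60;       % number of environmental changes
for tauT = [10 30]   % frequency of change (generations per change)
    % 60 generations (6000 FEs) precede the first change
    maxFEs = popSize * (nChanges * tauT + 60);
    fprintf('tau_t = %2d -> %6d fitness evaluations\n', tauT, maxFEs);
end
```

This yields 66,000 fitness evaluations for τt = 10 and 186,000 for τt = 30.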
2) Performance metric:
The MIGD [15] is used to evaluate the performance of an optimizer on each DCMOP; a smaller MIGD value indicates better performance.
Moreover, the MHV [16] is used to measure the comprehensive performance of an optimizer on the DCMOPs; a larger MHV value indicates better performance. A sketch of the MIGD computation is given below.
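As an illustration, the following minimal MATLAB sketch computes MIGD; it is not the official implementation. It assumes two hypothetical cell arrays: pf{t}, points sampled from the true Pareto front at the t-th environmental change, and pop{t}, the obtained nondominated objective vectors (rows = points, columns = objectives). MHV is computed analogously by averaging a hypervolume value over all changes.

```matlab
% MIGD: mean of the IGD values measured at every environmental change.
% Requires MATLAB R2016b+ (implicit expansion); no toolboxes needed.
function migd = computeMIGD(pf, pop)
    T   = numel(pf);                      % number of environmental changes
    igd = zeros(T, 1);
    for t = 1:T
        % Pairwise Euclidean distances: |PF*| x |P| matrix
        diff = permute(pf{t}, [1 3 2]) - permute(pop{t}, [3 1 2]);
        d    = sqrt(sum(diff.^2, 3));
        % IGD(t): average distance from each reference point to its
        % nearest obtained point
        igd(t) = mean(min(d, [], 2));
    end
    migd = mean(igd);                     % average IGD over all changes
end
```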
Please format the submitted competition results as in the table below. In particular, make sure that the submitted results are highly readable and that all the result types shown in the table are clearly recorded, including the mean and standard deviation of the MIGD/MHV values for each test instance.
All participants are asked to also submit the corresponding source code, which should allow reproduction of the submitted results. In addition, it would be helpful to submit a document that briefly describes the algorithm and the corresponding parameter settings.
If you have any queries, please contact us (chenguoyumail@163.com). We will follow the deadlines determined by CEC 2023; if you require extra time, you can contact us by email. If you have suggestions for improving the technical report, or if you find a potential bug in the code, please don't hesitate to let us know by email, so that we can keep you updated about any bug fixes and/or deadline extensions.
Table: MIGD and MHV results obtained by your algorithm on the DCF test suite

Problem | (τt, nt) | MIGD (mean(std.))    | MHV (mean(std.))
DCF1    | (10, 5)  | 1.1111E-2(1.1111E-3) | 1.1111E-2(1.1111E-3)
        | (10, 10) |                      |
        | (30, 5)  |                      |
        | (30, 10) |                      |
...
DCF10   | (10, 5)  |                      |
        | (10, 10) |                      |
        | (30, 5)  |                      |
        | (30, 10) |                      |
Based on the rankings of the competing algorithms on the above two indicators, a final Score is calculated to evaluate overall performance. The Score is built from two rank syntheses (SR), where each rank is obtained from the mean and standard deviation values computed for the given test functions under the two performance indicators. Once the SR values of the different algorithms are obtained, their Scores are calculated from them.
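The exact equations are given in the full competition report; as an illustration only, the following sketch shows one conventional sum-of-ranks aggregation (an assumption on our part, not the official definition):

```latex
% Assumption: K competing algorithms, J test instances
% (10 problems x 4 (tau_t, n_t) settings), two metrics (MIGD, MHV).
SR_k = \sum_{j=1}^{J} \sum_{m \in \{\mathrm{MIGD},\, \mathrm{MHV}\}}
       \Bigl( \operatorname{rank}^{\mathrm{mean}}_{k,j,m}
            + \operatorname{rank}^{\mathrm{std}}_{k,j,m} \Bigr),
\qquad
\mathrm{Score}_k = \frac{SR_k}{\sum_{i=1}^{K} SR_i}
```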
IEEE CEC 2023 conference certificates and prize money will be awarded to the winners of this competition (1st to 3rd place).
For participants planning to submit a paper to the 2023 IEEE Congress on Evolutionary Computation:
Paper submission: 27 Jan 2023 (extended from 13 Jan 2023)
Paper reviews: 17 Mar 2023 (extended from 3 Mar 2023)
Paper re-submission: 7 Apr 2023 (extended from 24 Mar 2023)
Paper final notifications: 14 Apr 2023 (extended from 31 Mar 2023)
Note: You are encouraged to submit your paper via the conference website: https://2023.ieee-cec.org/
Participants for competition only:
Results submission deadline: 30 Apr 2023 (extended from 31 Mar 2023)
Note: Please send your results directly to Mr. Guoyu Chen (chenguoyumail@163.com)
Yinan Guo
School of Mechanical Electronic and Information Engineering, China University of Mining and Technology (Beijing), Beijing, China
E-mail: nanfly@126.com
Guoyu Chen
School of Information and Control Engineering, China University of Mining and Technology, Xuzhou, China
E-mail: chenguoyumail@163.com
Caitong Yue
School of Electrical Engineering, Zhengzhou University, Zhengzhou, China
E-mail: zzuyuecaitong@163.com
Jing Liang
School of Electrical Engineering, Zhengzhou University, Zhengzhou, China
E-mail: liangjing@zzu.edu.cn
Yong Wang
School of Automation, Central South University, Changsha, China
E-mail: ywang@csu.edu.cn
Shengxiang Yang
School of Computer Science and Informatics, De Montfort University, Leicester LE1 9BH, U.K.
E-mail: syang@dmu.ac.uk