Finck, Steffen
In this paper, we consider the question of data aggregation using the practical example of emissions data for economic activities, as needed for the sustainability assessment of regional bank clients. Given the current scarcity of company-specific emission data, an approximation must rely on available public data. These data are reported in different standards across different sources. To determine a mapping between the different standards, an adaptation of the Covariance Matrix Self-Adaptation Evolution Strategy (CMSA-ES) is proposed. The results show that high-quality mappings are found. Moreover, our approach is transferable to other data compatibility problems, such as merging emissions data for other countries or bridging the gap between entirely different data sets.
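To make the optimization idea concrete, the following is a minimal sketch of an evolution-strategy search for a mapping matrix between two emission-reporting standards. The data, dimensions, fitness function, and the simple (mu/mu, lambda)-ES with a decaying step size are all hypothetical stand-ins; the paper itself adapts the more sophisticated CMSA-ES.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: map n_src source categories onto n_dst target
# categories via a non-negative mapping matrix M whose rows sum to 1.
n_src, n_dst = 8, 5
emissions_src = rng.uniform(10, 100, size=(40, n_src))   # toy source data
true_map = rng.dirichlet(np.ones(n_dst), size=n_src)     # toy ground truth
emissions_dst = emissions_src @ true_map                 # toy target data

def fitness(m_flat):
    """Squared reconstruction error of the candidate mapping."""
    M = np.abs(m_flat).reshape(n_src, n_dst)
    M /= M.sum(axis=1, keepdims=True)         # repair: keep rows stochastic
    return np.sum((emissions_src @ M - emissions_dst) ** 2)

# Plain (mu/mu, lambda) evolution strategy with a decaying global step size;
# this is only a simplified stand-in for the CMSA-ES adaptation in the paper.
dim, lam, mu, sigma = n_src * n_dst, 40, 10, 0.3
parent = rng.random(dim)
for gen in range(300):
    offspring = parent + sigma * rng.standard_normal((lam, dim))
    order = np.argsort([fitness(x) for x in offspring])
    parent = offspring[order[:mu]].mean(axis=0)   # recombine the best mu
    sigma *= 0.98                                 # crude step-size decay
print("residual mapping error:", fitness(parent))
```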
Making use of the data gathered in Industry 4.0 and smart factory scenarios continues to be a challenge for companies of all sizes, often because they aim to start with complicated and time-intensive Machine Learning scenarios. This work evaluates Process Capability Analysis (PCA) as a pragmatic, easy, and quick way of leveraging the machine data gathered from the production process. The area of application considered is injection molding. After describing the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA yields multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing the whole process data instead of considering only selected production samples. The use of machine data enables additional insights to be drawn about process stability and the associated product quality.
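For illustration, here is a minimal sketch of the standard process capability indices Cp and Cpk, the kind of key performance indicators such an analysis yields; the specification limits and part-weight data below are hypothetical.

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Standard process capability indices Cp and Cpk."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability incl. centering
    return cp, cpk

# Hypothetical part-weight measurements from an injection molding machine.
rng = np.random.default_rng(0)
weights = rng.normal(loc=25.02, scale=0.05, size=500)   # grams
cp, cpk = process_capability(weights, lsl=24.85, usl=25.15)
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}")   # Cpk > 1.33 is often read as capable
```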
This paper presents a comparison between production and simulation data, carried out as part of a larger initiative on the use of shop floor data at a project partner in the automotive industry. In this project, the data generated during the mold filling simulation were compared with the data from the final tool acceptance to analyze how closely they agree. The better the simulation, the faster the entire tool development process can be completed; as a core process, it offers massive savings potential and thus a competitive advantage.
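As a rough illustration of such a comparison, here is a small sketch with hypothetical paired values (simulated vs. measured at tool acceptance) and simple deviation metrics; the actual quantities and agreement metrics used in the project are not specified in the abstract.

```python
import numpy as np

# Hypothetical paired values: filling simulation vs. final tool acceptance.
simulated = np.array([182.0, 175.5, 190.2, 168.8, 177.3])   # e.g. pressure, bar
measured  = np.array([179.4, 176.1, 193.0, 171.2, 175.9])

deviation = measured - simulated
print("mean abs. deviation:", np.mean(np.abs(deviation)))
print("max rel. deviation: ", np.max(np.abs(deviation) / np.abs(measured)))
```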
Recent developments in the area of Natural Language Processing (NLP) increasingly allow such techniques to be extended to previously untapped areas of application. This paper deals with the application of state-of-the-art NLP techniques to the domain of Product Safety Risk Assessment (PSRA). PSRA is concerned with quantifying the risks a user is exposed to during product use. The use case arises from an important process for maintaining due diligence towards the customers of the company OMICRON electronics GmbH.
The paper proposes an approach to evaluate the consistency of human-made risk assessments produced by potentially changing expert panels. Along the stages of this NLP-based approach, multiple insights into the PSRA process allow for an improved understanding of the related risk distribution within the company's product portfolio. The findings aim to make the current process more transparent and to automate repetitive tasks. The results of this paper can be regarded as a first step towards supporting domain experts in the risk assessment process.
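The abstract does not name the concrete NLP technique, so the following is only a generic sketch of one way to flag inconsistencies: hazard descriptions that read alike (here via TF-IDF cosine similarity) but received very different risk ratings. All descriptions, ratings, and thresholds are hypothetical.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical hazard descriptions and the panel's risk ratings (1-5).
descriptions = [
    "electric shock from exposed terminal during maintenance",
    "electric shock hazard at open terminal while servicing",
    "minor pinch point at housing cover",
]
ratings = np.array([4, 2, 1])

tfidf = TfidfVectorizer().fit_transform(descriptions)
sim = cosine_similarity(tfidf)

# Flag pairs that read alike but were rated far apart (hypothetical thresholds).
for i in range(len(descriptions)):
    for j in range(i + 1, len(descriptions)):
        if sim[i, j] > 0.25 and abs(ratings[i] - ratings[j]) >= 2:
            print(f"inconsistent pair ({i}, {j}): similarity {sim[i, j]:.2f}")
```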
With Cloud Computing and multi-core CPUs, parallel computing resources are becoming increasingly affordable and commonly available. Parallel programming should be just as easily accessible to everyone. Unfortunately, existing frameworks and systems, while powerful, are often very complex to use for anyone who lacks knowledge of the underlying concepts. This paper introduces a software framework and execution environment whose objective is to be easily usable by everyone who could benefit from parallel computing. Some real-world examples are presented, with an explanation of all the steps necessary for computing in a parallel and distributed manner.
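The framework itself is not shown in the abstract; as a point of reference, the sketch below illustrates the underlying idea of hiding parallelism behind a simple interface, using only Python's standard-library multiprocessing pool. The task function is a hypothetical stand-in.

```python
from multiprocessing import Pool
import random

def simulate(seed):
    """Stand-in for an expensive, independent task: Monte Carlo pi."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(100_000))
    return 4.0 * inside / 100_000

if __name__ == "__main__":
    with Pool() as pool:                   # one worker per CPU core by default
        estimates = pool.map(simulate, range(8))
    print(sum(estimates) / len(estimates))
```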
Stress testing is part of today's bank risk management and is often required by the governing regulatory authority. Performing such a stress test with stress scenarios derived from a distribution, instead of pre-defined expert scenarios, results in a systematic approach in which new severe scenarios can be discovered. The required scenario distribution is obtained from historical time series via a Vector-Autoregressive time series model. The worst-case search, i.e., finding the scenario yielding the most severe situation for the bank, can be stated as an optimization problem. The problem itself is a constrained optimization problem in a high-dimensional search space. The constraints are the box constraints on the scenario variables and the plausibility of a scenario, where the latter is expressed by an elliptic constraint. As the stress scenarios are evaluated with a simulation tool, the optimization problem can be seen as a black-box optimization problem. Evolution Strategy, a well-known optimizer for black-box problems, is applied here. The necessary adaptations to the algorithm are explained, and a set of different algorithm design choices is investigated. It is shown that a simple box constraint handling method, i.e., setting variables that violate a box constraint to the respective boundary of the feasible domain, in combination with a repair of implausible scenarios, provides good results.
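A minimal sketch of the constraint handling described above: box violations are clipped to the boundary, and implausible scenarios are repaired by shrinking them onto the plausibility ellipsoid. The objective function, covariance, bounds, and the simple (1+1)-ES below are toy stand-ins for the bank simulation tool and the ES variant used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 6
mu_hist = np.zeros(dim)                         # toy historical scenario mean
Sigma_inv = np.linalg.inv(np.diag(rng.uniform(0.5, 2.0, dim)))  # toy covariance
lower, upper, r2 = -3.0, 3.0, 9.0   # box bounds, squared Mahalanobis radius

def repair(x):
    """Clip box violations to the boundary, then shrink implausible
    scenarios onto the plausibility ellipsoid."""
    x = np.clip(x, lower, upper)
    d2 = (x - mu_hist) @ Sigma_inv @ (x - mu_hist)
    if d2 > r2:                                 # implausible scenario
        x = mu_hist + (x - mu_hist) * np.sqrt(r2 / d2)
    return x

def loss(x):
    """Toy stand-in for the bank simulation; lower = more severe scenario."""
    return -np.sum(x ** 2) + 0.5 * np.sum(x)

# Simple (1+1)-ES with success-based step-size adaptation.
x = repair(rng.standard_normal(dim))
sigma, best = 0.5, loss(x)
for _ in range(2000):
    y = repair(x + sigma * rng.standard_normal(dim))
    fy = loss(y)
    if fy < best:
        x, best = y, fy
        sigma *= 1.1            # success: enlarge the step size
    else:
        sigma *= 0.98           # failure: shrink it
print("worst-case loss found:", best)
```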
In engineering design, optimization methods are frequently used to improve the initial design of a product. However, selecting an appropriate method is challenging since many methods exist, especially in the case of simulation-based optimization. This paper proposes a systematic procedure to support this selection process. Building upon quality function deployment, end-user and design use case requirements can be systematically taken into account via a decision matrix. The design and construction of the decision matrix are explained in detail. The proposed procedure is validated on two engineering optimization problems arising in the design of box-type boom cranes. For each problem, the problem statement and the applied optimization methods are explained in detail. The obtained results validate the use of optimization approaches within the design process, and the application of the decision matrix shows the successful incorporation of customer requirements into the algorithm selection.
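To illustrate the idea of such a decision matrix, here is a small QFD-style weighted-scoring sketch; the requirements, weights, candidate methods, and scores are all hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical decision matrix: how well each optimization method serves
# each requirement, scored 1 (poor) to 9 (excellent), QFD-style.
requirements = ["few simulations", "handles constraints",
                "no gradients needed", "easy to parameterize"]
weights = np.array([0.4, 0.3, 0.2, 0.1])   # end-user priorities, sum to 1

methods = ["Evolution Strategy", "Bayesian Optimization", "Nelder-Mead"]
scores = np.array([
    [3, 9, 9, 7],   # Evolution Strategy
    [9, 5, 9, 3],   # Bayesian Optimization
    [5, 1, 9, 9],   # Nelder-Mead
])

totals = scores @ weights   # weighted total per method
for name, total in sorted(zip(methods, totals), key=lambda t: -t[1]):
    print(f"{name:22s} {total:.2f}")
```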