Digital twin as enabler of business model innovation for infrastructure construction projects
(2023)
Emerging technologies and methods are becoming an important element of the construction industry. Digital twins are used as a basis for storing data in BIM models and for exploiting and visualizing those data. Transparency in all phases of the lifecycle of building and infrastructure assets is crucial for a more efficient lifecycle of planning, construction and maintenance. Whereas other industries have increased performance in these phases by exploiting their data, the construction industry remains stuck in traditional methods and business models. In this paper we propose a concept that focuses on the digital production twin. Comparing planning data with as-is production data can empower a data-driven continuous improvement process and support decision-making on future innovations and suitable business models. The paper outlines how the data stored in a digital twin can be used to evaluate possible business models.
Through mandatory ESG (environmental, social, governance) reporting, large companies must disclose their ESG activities, showing how sustainability risks are incorporated into their decision-making and production processes. This disclosure obligation, however, does not apply to small and medium-sized enterprises (SMEs), creating a gap in the ESG dataset. Banks are therefore required to collect sustainability data on their SME customers independently to ensure complete ESG integration in the risk analysis process for loans. In this paper, we examine ESG risk analysis through a smart science approach, focusing on possible value outcomes of sustainable smart services for banks as well as for their (SME) customers. The paper describes ESG factors, how services can be derived from them, targeted ESG metrics, and an ESG Service Creation Framework (business ecosystem building, process model, and value creation). The description of an exemplary use case, highlighting the necessary ecosystem for service creation as well as the created value, concludes the paper.
Immersive educational spaces
(2023)
"If only we had had such opportunities to grasp history like this when I was young" – words by an almost 80-year-old woman holding an iPad on which both, the buildings in the background and a tower in the form of a virtual 3D object, appear within reach. To "grasp" history - what an apt use of this action-oriented word for an augmented reality application built on considerations of thinking and acting in history. This telling image emerged during the first test run of the app i.appear which will be the focus of this article's considerations on the use of immersive learning environments. The application i.appear has been used in the city of Dornbirn (Austria) for a year now to teach historical content through location-based augmented reality and other interactive and multimedia technologies. After a brief description of the potential of such applications, the epistemological structure of the hosting app i.appear and its functionality will be outlined. This article will focus on the “Baroque Master Builders” tour of the hosting app that was created and tested as part of the current research.
In this paper, a 256-channel, 10-GHz arrayed waveguide grating (AWG) demultiplexer for ultra-dense wavelength division multiplexing was designed using an in-house developed tool called AWG-Parameters. The AWG demultiplexer was designed for a central wavelength of 1550 nm, and the structure was simulated in the PHASAR tool from Optiwave. Two different AWG designs were developed, and the influence of the design parameters on the AWG performance was studied.
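As a quick sanity check on the grid above, the 10-GHz channel spacing translates to roughly 0.08 nm at 1550 nm. A minimal sketch of that conversion (independent of the AWG-Parameters tool, which is not described here):

```python
# Frequency-to-wavelength grid conversion for the 256-channel, 10-GHz AWG:
# d_lambda = lambda0**2 * d_f / c around the 1550 nm design wavelength.
C = 299_792_458.0          # speed of light in vacuum, m/s
lambda0 = 1550e-9          # central wavelength, m
delta_f = 10e9             # channel spacing, Hz
n_channels = 256

delta_lambda = lambda0**2 * delta_f / C   # wavelength spacing, m
band = n_channels * delta_lambda          # occupied wavelength band, m

print(f"channel spacing: {delta_lambda * 1e9:.4f} nm")   # ~0.0801 nm
print(f"total band:      {band * 1e9:.2f} nm")           # ~20.5 nm
```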
Design, simulation, and optimization of a 1×4 optical three-dimensional multimode interference (MMI) splitter using IP-Dip polymer as a core and polydimethylsiloxane (PDMS) Sylgard 184 as a cladding is demonstrated. The splitter was simulated using the beam propagation method in the BeamPROP simulation module of the RSoft photonic tool and optimized for an operating wavelength of 1.55 μm. Based on the minimum insertion loss, the dimensions of the splitter were optimized for a waveguide with a core size of 4×4 μm². The objective of the study is to create a design for fabrication by three-dimensional direct laser writing optical lithography.
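For orientation, the length of such an MMI section can be estimated from standard self-imaging theory (Soldano & Pennings, 1995) before any BPM optimization. The effective index and MMI width below are illustrative assumptions, not the optimized geometry from the paper:

```python
# Rough self-imaging estimate for a center-fed 1xN MMI splitter.
# All values are illustrative assumptions, not the paper's design.
lambda0 = 1.55      # operating wavelength, um
n_r = 1.53          # assumed effective index for IP-Dip near 1550 nm
W_e = 20.0          # assumed effective MMI section width, um
N = 4               # number of output ports

L_pi = 4 * n_r * W_e**2 / (3 * lambda0)   # beat length of the two lowest modes
L_mmi = 3 * L_pi / (4 * N)                # first N-fold image (symmetric excitation)

print(f"beat length L_pi: {L_pi:.1f} um")
print(f"1x{N} MMI length : {L_mmi:.1f} um")
```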
Grey-box models provide an important approach for control analysis in the heating, ventilation and air conditioning (HVAC) sector. They consist of physical models whose parameters are estimated from data. Given the vast number of component models in the literature, the question arises which component models perform best on a given system or dataset. This question is investigated systematically using a test case system with real operational data. The test case consists of an HVAC system containing an energy recovery unit (ER), a heating coil (HC) and a cooling coil (CC). For each component, several suitable model variants from the literature are adapted and implemented: four for the ER and five each for the HC and CC. Furthermore, three global and four local optimization algorithms are implemented to solve the nonlinear least-squares system identification, leading to a total of 700 combinations. The comparison of all variants shows that the global optimization algorithms do not provide significantly better solutions, while their runtimes are significantly higher. Analysis of the models shows that model accuracy depends on the total number of parameters.
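A minimal sketch of the grey-box idea, assuming a one-parameter effectiveness model for the heating coil and synthetic data; the paper's actual model variants and optimization algorithms are more elaborate:

```python
# Grey-box identification sketch: a static heating-coil model with one
# physical parameter (effectiveness eps) fitted to operational data via
# nonlinear least squares. Data below are synthetic stand-ins.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t_air_in = rng.uniform(5.0, 15.0, 200)     # air inlet temperature, degC
t_water_in = rng.uniform(40.0, 60.0, 200)  # water inlet temperature, degC

def model(eps, t_a, t_w):
    # air outlet temperature of an effectiveness-based heating coil
    return t_a + eps * (t_w - t_a)

eps_true = 0.62
t_air_out = model(eps_true, t_air_in, t_water_in) + rng.normal(0.0, 0.3, 200)

def residuals(p):
    return model(p[0], t_air_in, t_water_in) - t_air_out

fit = least_squares(residuals, x0=[0.5], bounds=(0.0, 1.0))
print(f"estimated effectiveness: {fit.x[0]:.3f}")  # close to 0.62
```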
This paper presents a comparison between production and simulation data, carried out as part of a larger initiative on the use of shop floor data at a project partner in the automotive industry. In this project, the data generated during the mold filling simulation were compared with the data from the final tool acceptance in order to analyze how closely they match. The better the simulation, the faster the entire tool development process can be completed; as a core process, it carries massive savings potential and thus a competitive advantage.
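A hypothetical sketch of such a comparison, assuming paired simulation and acceptance measurements; the quantity, names and values are illustrative, not project data:

```python
# Quantifying agreement between fill-simulation output and measured values
# from the final tool acceptance. Arrays below are illustrative stand-ins.
import numpy as np

simulated = np.array([210.0, 215.5, 198.2, 205.7, 220.1])  # e.g. cavity pressures
measured  = np.array([208.3, 218.0, 195.6, 207.9, 223.4])

abs_dev = np.abs(simulated - measured)
rel_dev = abs_dev / np.abs(measured)

print(f"mean absolute deviation : {abs_dev.mean():.2f}")
print(f"mean relative deviation : {rel_dev.mean() * 100:.1f} %")
print(f"correlation             : {np.corrcoef(simulated, measured)[0, 1]:.3f}")
```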
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes, often because they aim to start with complicated and time-intensive machine learning scenarios. This work evaluates Process Capability Analysis (PCA) as a pragmatic, easy and quick way of leveraging the machine data gathered from the production process. The area of application considered is injection molding. After describing the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA yields multiple key performance indicators that allow fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing whole process data instead of considering only selected production samples. The use of machine data enables additional insights into process stability and the associated product quality.
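A minimal sketch of the capability indices underlying such monitoring, with illustrative specification limits and synthetic part weights (the paper's concrete KPIs may differ): Cp measures process spread against the tolerance band, while Cpk additionally penalizes off-center processes.

```python
# Process capability indices for a stream of measured part values.
# Specification limits and data below are illustrative assumptions.
import numpy as np

def capability(values, lsl, usl):
    mu, sigma = values.mean(), values.std(ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)               # spread vs. tolerance band
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)  # also penalizes off-center mean
    return cp, cpk

rng = np.random.default_rng(1)
part_weights = rng.normal(loc=12.02, scale=0.03, size=500)  # molded part weight, g

cp, cpk = capability(part_weights, lsl=11.90, usl=12.10)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # Cpk < Cp signals an off-center process
```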
Parametric anti-resonance is a phenomenon that occurs in systems with at least two degrees of freedom and can be achieved by periodically exciting some parameters of the system. The effect of this properly tuned periodicity is to increase the dissipation in the system, which raises the effective damping of vibrations. This contribution presents the design of an open-loop control to reduce the settling time using the anti-resonance concept. The control signal consists of a quasi-periodic signal capable of transferring the system's oscillations from one mode of the system to another. The general averaging technique is used to characterize the dynamics, particularly the so-called slow dynamics of motion. With this analysis, the control signal is designed for the potential application of a microelectromechanical sensor arrangement; for this specific example, a reduction of up to 96.8% in settling time is achieved.
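A numerical sketch of the anti-resonance mechanism on a generic two-degree-of-freedom oscillator, with illustrative parameters rather than the MEMS design from the contribution: one stiffness is modulated at the difference of the two natural frequencies, which couples the modes and, when well tuned, can speed up the free decay compared to the unmodulated system.

```python
# Parametric anti-resonance sketch: two-DOF oscillator with a stiffness
# modulated at the difference of its natural frequencies (illustrative values).
import numpy as np
from scipy.integrate import solve_ivp

k1, k2, kc, c = 1.0, 4.0, 0.5, 0.01
K0 = np.array([[k1 + kc, -kc], [-kc, k2 + kc]])   # nominal stiffness (unit masses)
C = c * np.eye(2)                                  # light viscous damping

omega = np.sqrt(np.linalg.eigvalsh(K0))   # natural frequencies
nu = omega[1] - omega[0]                  # anti-resonance tuning: difference frequency
mu = 0.3                                  # modulation depth of k1

def rhs(t, y, depth):
    x, v = y[:2], y[2:]
    K = K0.copy()
    K[0, 0] += depth * k1 * np.cos(nu * t)  # open-loop periodic stiffness variation
    return np.concatenate([v, -C @ v - K @ x])

y0 = [1.0, 0.0, 0.0, 0.0]
for depth in (0.0, mu):
    sol = solve_ivp(rhs, (0.0, 400.0), y0, args=(depth,), max_step=0.05)
    # total mechanical energy (up to a constant factor) along the trajectory
    energy = (np.sum(sol.y[2:]**2, axis=0)
              + np.einsum('it,ij,jt->t', sol.y[:2], K0, sol.y[:2]))
    print(f"depth={depth:.1f}: residual energy fraction {energy[-1] / energy[0]:.3e}")
```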
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can achieve high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that the best possible configuration reaches a coefficient of determination of R² = 0.95 and a mean absolute error of MAE = 1.2 t/h at an average steam production of 35.1 t/h. On average, training on a dynamic dataset lowers the MAE by 32% compared to training on a static dataset. A feature set based on expert knowledge lowers the MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When a dynamic dataset is used, the performance parameters of thermal power plants are predicted with high accuracy, allowing short-term problems to be detected.
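A minimal sketch of such a benchmark, assuming synthetic sensor data and two example regressors in place of the four algorithms compared in the paper; the chronological split mimics training on one operating period and predicting another:

```python
# Benchmarking regressors on a feature matrix X (sensor signals) to predict
# steam production y, scored with R^2 and MAE. Data below are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 8))                    # stand-in sensor features
y = 35.1 + 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0.0, 1.0, 2000)

# shuffle=False keeps time order: train on the early period, test on the late one
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, shuffle=False)

for model in (LinearRegression(), RandomForestRegressor(random_state=0)):
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{type(model).__name__:>22}: "
          f"R2 = {r2_score(y_te, pred):.3f}, "
          f"MAE = {mean_absolute_error(y_te, pred):.2f} t/h")
```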