Conference Proceeding
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes, often because they attempt to start with complicated and time-intensive machine learning scenarios. This work evaluates Process Capability Analysis (PCA) as a pragmatic, easy, and quick way of leveraging the machine data gathered from the production process. The area of application considered is injection molding. After describing the required domain knowledge, the paper presents an approach for continuous analysis of all parts produced. Applying PCA yields multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations give the quality department a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing the whole process data instead of considering only selected production samples. The use of machine data enables additional insights into process stability and the associated product quality.
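The key performance indicators produced by a process capability analysis typically include the standard capability indices Cp and Cpk. As a minimal sketch (the measurement values and specification limits below are hypothetical, not from the paper's case study), these indices can be computed directly from continuously recorded part measurements:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Compute the standard capability indices Cp and Cpk from measured
    part values and the lower/upper specification limits (LSL/USL)."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability incl. centring
    return cp, cpk

# Hypothetical injection-moulding measurements (part weight in grams)
weights = [10.02, 9.98, 10.01, 9.97, 10.03, 10.00, 9.99, 10.02]
cp, cpk = process_capability(weights, lsl=9.85, usl=10.15)
```

Monitoring Cp and Cpk per production batch, as the abstract suggests, allows the quality department to flag batches whose indices drop below a chosen threshold (commonly 1.33) for targeted inspection.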
(2023)
In 2021, a prominent Austrian dairy producer suffered an IT attack and was completely paralysed. Without clearly defined mitigation measures in place, major disruptions occurred along the whole supply chain, including logistics service providers, governmental food safety bodies, and retailers (i.e., supermarkets and convenience stores). In this paper, we ask how digitisation and digital transformation impact IT security, especially considering the complex company ecosystems of food production and food supply chains in Austria. The problem statement stems from a gap in knowledge of key differences in approaches to IT security, resilience, risk management, and especially business interfaces between food suppliers, supermarkets, distributors, logistics providers, and other service providers. To answer the related research questions, the authors first conduct a literature review, highlighting common guidelines and standardisation as well as state-based recommendations for critical infrastructure. In a second step, a quantitative and qualitative survey of Austrian food companies (producers and retailers) is described in detail. A description of recommended measures for the industry, further steps, and an outlook conclude the paper.
Through mandatory ESG (environmental, social, governance) reporting, large companies must disclose their ESG activities, showing how sustainability risks are incorporated into their decision-making and production processes. This disclosure obligation, however, does not apply to small and medium-sized enterprises (SMEs), creating a gap in the ESG dataset. Banks are therefore required to collect sustainability data on their SME customers independently to ensure complete ESG integration in the risk analysis process for loans. In this paper, we examine ESG risk analysis through a smart science approach, focusing on possible value outcomes of sustainable smart services for banks as well as for their (SME) customers. The paper describes ESG factors, how services can be derived from them, targeted ESG metrics, and an ESG Service Creation Framework (business ecosystem building, process model, and value creation). The description of an exemplary use case, highlighting the necessary ecosystem for service creation as well as the created value, concludes the paper.
Parametric anti-resonance is a phenomenon that occurs in systems with at least two degrees of freedom; it can be achieved by periodically exciting some parameters of the system. The effect of this properly tuned periodicity is to increase the dissipation in the system, which leads to an increase in the effective damping of vibrations. This contribution presents the design of an open-loop control to reduce the settling time using the anti-resonance concept. The control signal consists of a quasi-periodic signal capable of transferring the system's oscillations from one mode of the system to another. The general averaging technique is used to characterize the dynamics, particularly the so-called slow dynamics of motion. With this analysis, the control signal is designed for the potential application of a microelectromechanical sensor arrangement; for this specific example, up to 96.8% reduction of settling time is achieved.
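As a sketch of the underlying mechanism (a generic two-degree-of-freedom formulation, not the paper's specific MEMS model), parametric anti-resonance is commonly studied for a system whose stiffness is varied harmonically:

```latex
% Generic two-DOF system with parametric stiffness excitation
M\ddot{\mathbf{q}} + C\dot{\mathbf{q}}
  + \bigl(K_0 + \varepsilon K_1 \cos(\nu t)\bigr)\mathbf{q} = \mathbf{0}

% Anti-resonance (enhanced effective damping) occurs when the
% parametric excitation frequency is tuned near the difference
% of the two natural frequencies:
\nu \approx \Omega_2 - \Omega_1
```

Tuning the excitation frequency to this combination resonance of the difference type couples the two modes so that vibration energy is transferred between them and dissipated faster, which is what shortens the settling time.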
This paper presents a comparison between production and simulation data, carried out as part of a larger initiative on the use of shop-floor data at a project partner in the automotive industry. In this project, the data generated during the mold-filling simulation were compared with the data from the final tool acceptance in order to analyze how closely they agree. The better the simulation, the faster the entire tool development process can be completed; as a core process, it offers massive savings potential and thus a competitive advantage.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R^2 = 0.95 and a mean absolute error of MAE = 1.2 t/h are reached, at an average steam production of 35.1 t/h. On average, using a dynamic dataset for training lowers MAE by 32% compared to a static dataset. A feature set based on expert knowledge lowers MAE by an additional 32%, compared to a simple feature set representing the fuel inputs. We conclude that based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When using a dynamic dataset, the performance parameters of thermal power plants are predicted with high accuracy and allow for detecting short-term problems.
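The two reported metrics, R^2 and MAE, can be computed in a few lines. The sketch below uses synthetic stand-in data and an ordinary least-squares baseline (the plant's real sensor signals and the four benchmarked algorithms are not public), only to illustrate how such a benchmark evaluates a prediction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the plant data: fuel-input features -> steam (t/h).
X = rng.uniform(0.0, 1.0, size=(500, 3))
y = 20.0 + 30.0 * X[:, 0] + 10.0 * X[:, 1] + rng.normal(0.0, 1.0, 500)

# Chronological split, mimicking a "static" training set.
X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]

# Ordinary least squares as one baseline "algorithm".
A = np.column_stack([np.ones(len(X_tr)), X_tr])
coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
pred = np.column_stack([np.ones(len(X_te)), X_te]) @ coef

# Evaluation metrics as reported in the abstract.
mae = np.mean(np.abs(pred - y_te))
r2 = 1.0 - np.sum((y_te - pred) ** 2) / np.sum((y_te - np.mean(y_te)) ** 2)
```

In the paper's setup, the same held-out evaluation would be repeated for each combination of algorithm, feature set, and (static or dynamic) training set.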
Immersive educational spaces
(2023)
"If only we had had such opportunities to grasp history when I was young" – words by an almost 80-year-old woman holding an iPad on which both the buildings in the background and a tower in the form of a virtual 3D object appear within reach. To "grasp" history - what an apt use of this action-oriented word for an augmented reality application built on considerations of thinking and acting in history. This telling image emerged during the first test run of the app i.appear, which is the focus of this article's considerations on the use of immersive learning environments. The application i.appear has been used in the city of Dornbirn (Austria) for a year now to teach historical content through location-based augmented reality and other interactive and multimedia technologies. After a brief description of the potential of such applications, the epistemological structure of the hosting app i.appear and its functionality are outlined. The article focuses on the "Baroque Master Builders" tour of the hosting app, which was created and tested as part of the current research.
Grey box models provide an important approach for control analysis in the Heating, Ventilation and Air Conditioning (HVAC) sector. Grey box models consist of physical models whose parameters are estimated from data. Due to the vast number of component models found in the literature, the question arises as to which component models perform best on a given system or dataset. This question is investigated systematically using a test case system with real operational data. The test case system consists of an HVAC system containing an energy recovery unit (ER), a heating coil (HC), and a cooling coil (CC). For each component, several suitable model variants from the literature are adapted appropriately and implemented: four model variants for the ER and five model variants each for the HC and CC. Furthermore, three global and four local optimization algorithms for solving the nonlinear least-squares system identification are implemented, leading to a total of 700 combinations. The comparison of all variants shows that the global optimization algorithms do not provide significantly better solutions, while their runtimes are significantly higher. Analysis of the models shows a dependency of the model accuracy on the total number of parameters.
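The core identification step described here - fitting the physical parameters of a component model to operational data via nonlinear least squares - can be sketched as follows. The heating-coil model, its parameters, and the data below are hypothetical illustrations, not the paper's actual model variants:

```python
import numpy as np
from scipy.optimize import least_squares

def coil_model(p, m_dot, t_water_in, t_air_in):
    """Hypothetical heating-coil grey-box model: the air-side outlet
    temperature depends on a flow-dependent transfer parameter p[0]*m^p[1]."""
    eff = p[0] * m_dot ** p[1]
    return t_air_in + eff * (t_water_in - t_air_in)

# Synthetic "operational data" with measurement noise.
rng = np.random.default_rng(1)
m_dot = rng.uniform(0.5, 2.0, 200)          # air mass flow
t_air_in = rng.uniform(15.0, 25.0, 200)     # inlet air temperature
t_water = 60.0                              # supply water temperature
p_true = np.array([0.4, 0.6])
t_out = coil_model(p_true, m_dot, t_water, t_air_in) + rng.normal(0, 0.2, 200)

# Local nonlinear least-squares system identification: minimize the
# residual between model prediction and measured outlet temperature.
res = least_squares(
    lambda p: coil_model(p, m_dot, t_water, t_air_in) - t_out,
    x0=[1.0, 1.0],
)
```

Each of the paper's 700 combinations corresponds to one such fit: a choice of model variant per component (4 x 5 x 5) paired with one of the seven optimization algorithms.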