In this paper, we propose and simulate a new type of three-dimensional (3D) optical splitter based on multimode interference (MMI) for a wavelength of 1550 nm. The splitter is designed on a square base of 20 × 20 µm², using the IP-Dip polymer, a standard material for 3D laser lithography. We present the optical field distribution in the proposed MMI splitter and discuss its possible integration on an optical fiber. The design is geared towards fabrication by 3D laser lithography in forthcoming experiments.
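The dimensions of an MMI section are commonly estimated from its beat length. As a rough illustration (not taken from the paper), the standard self-imaging approximation can be evaluated as follows; the effective index for IP-Dip and the effective width are assumed values:

```python
# Hypothetical back-of-the-envelope estimate of the MMI beat length
# L_pi ~ 4 * n_eff * W_eff^2 / (3 * lambda). The IP-Dip effective index
# and effective width below are illustrative assumptions, not paper values.
def mmi_beat_length(n_eff: float, w_eff_um: float, wavelength_um: float) -> float:
    """Beat length (in um) of the two lowest-order modes of a multimode section."""
    return 4.0 * n_eff * w_eff_um ** 2 / (3.0 * wavelength_um)

L_pi = mmi_beat_length(n_eff=1.53, w_eff_um=20.0, wavelength_um=1.55)
print(f"Beat length: {L_pi:.1f} um")
```

Such an estimate only fixes the order of magnitude of the multimode-section length; the actual device geometry is determined by full 3D simulation.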
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid-fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best configuration, a coefficient of determination of R^2 = 0.95 and a mean absolute error of MAE = 1.2 t/h are reached, at an average steam production of 35.1 t/h. On average, training on a dynamic dataset lowers the MAE by 32% compared to training on a static dataset. A feature set based on expert knowledge lowers the MAE by a further 32% compared to a simple feature set representing only the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. With a dynamic training dataset, the performance parameters of thermal power plants are predicted with high accuracy, which allows short-term problems to be detected.
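The two reported metrics are standard and easy to reproduce. A minimal sketch, using made-up steam-production values rather than the study's data:

```python
# Illustrative computation of the two reported metrics (MAE and R^2)
# on made-up steam-production data; numbers are not from the study.
def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

y_true = [34.0, 35.5, 36.2, 33.8, 35.9]   # measured steam production in t/h
y_pred = [33.5, 35.0, 36.8, 34.2, 35.4]   # model predictions in t/h
print(f"MAE = {mae(y_true, y_pred):.2f} t/h, R^2 = {r2(y_true, y_pred):.3f}")
```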
This thesis aims to support the product development process. To this end, an approach for the automated solution-space exploration of formally predefined design automation tasks, which capture the product knowledge of engineers, is developed, implemented as a prototype, and evaluated. First, product development tasks are classified with respect to the representation of their mathematical model, based on the parameters defined in this thesis. In a second step, the mathematical model is solved; for this, a solver capable of handling the given problem class is identified.
Due to the context of this work, the Systems Modeling Language (SysML) is chosen for formalising the product knowledge. In the next step, the given SysML model is translated into an object-oriented model. This translation is implemented by extracting information from an ".xml" file following the XML Metadata Interchange (XMI) standard; the information contained in the file is structured using the Unified Modeling Language (UML) profile for SysML. Afterwards, a mathematical model in the MiniZinc language is generated. MiniZinc is a mathematical modelling language that many different solvers can interpret. The generated mathematical model is classified according to its variable types and the linearity of its constraints and objective, and the output is stored in a ".txt" file.
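The translation step can be sketched in a strongly simplified form: read an XML export, pick out the declared value properties, and emit one MiniZinc variable declaration per property. The tag and attribute names below are illustrative stand-ins, not the real SysML/XMI schema:

```python
# Much-simplified sketch of the XMI -> MiniZinc translation step.
# The element and attribute names are hypothetical, not the XMI standard.
import xml.etree.ElementTree as ET

XMI_SNIPPET = """<model>
  <property name="width"  type="int" lower="1" upper="100"/>
  <property name="height" type="int" lower="1" upper="50"/>
</model>"""

def xmi_to_minizinc(xmi_text: str) -> str:
    """Emit a MiniZinc declaration for every bounded value property."""
    root = ET.fromstring(xmi_text)
    lines = []
    for prop in root.iter("property"):
        name, lo, hi = prop.get("name"), prop.get("lower"), prop.get("upper")
        lines.append(f"var {lo}..{hi}: {name};")
    lines.append("solve satisfy;")
    return "\n".join(lines)

print(xmi_to_minizinc(XMI_SNIPPET))
```

A real implementation additionally has to resolve the UML profile stereotypes and translate constraint blocks, which is the bulk of the work described in the thesis.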
To evaluate the functionality of the prototype, the time consumption of the individual procedures is measured. The data show that models containing continuous variables take longer to classify and optimise. A further observation is that the duration of the transformation into an object-oriented model, and of the translation of this model into a mathematical representation, depends on the number of SysML model elements. Using MiniZinc imposed the restriction that models using non-linear functions and Boolean expressions cannot be solved, because the integration of non-linear solvers into MiniZinc is still under development. An investigation of the optimality of the results provided by the solvers is left for future work.
The Digital Factory Vorarlberg is the youngest research center of Vorarlberg University of Applied Sciences. In the lab of the research center, a research and learning factory has been established for educating students and employees of industrial partners. It demonstrates showcases and best-practice scenarios for various topics of digitalization in the manufacturing industry. In addition, novel methods and technologies for digital production, cloud-based manufacturing, data analytics, IT and OT security, and digital twins are being developed. The factory comprises only a minimal core of logistics and fabrication processes to guarantee manageability within an academic setup. As a product, fidget spinners are fabricated. A webshop allows customers to individually design their products and place orders directly with the factory. A centralized SCADA system is the core data hub of the factory; various data analytics tools and methods and a novel database for IoT applications are connected to it. As an alternative to on-premises manufacturing, orders can be pushed into a cloud-based manufacturing platform developed at the Digital Factory. A broker system allows fabrication in distributed facilities and offers various optimization services. Concepts such as outsourcing product configuration to customers or new types of engineering services in cloud-based manufacturing can be explored and demonstrated. In this paper, we present the basic concept of the Digital Factory Vorarlberg, as well as some of the newly developed topics.
Flexibility estimation is the first step required to incorporate building energy systems into demand-side management programs. We extend a known method for temporal flexibility estimation from the literature to a real-world residential heat pump system, based solely on historical cloud data. The proposed method relies on robust simplifications and estimates that employ process knowledge, energy balances, and manufacturer information. The resulting forced and delayed temporal flexibility, with both domestic hot water and space heating demands as constraints, allows a flexibility range for the heat pump system to be derived. The resulting temporal flexibility lies between 24 minutes and 6 hours for forced and delayed flexibility, respectively. This range provides new insights into the system's behaviour and is the basis for estimating power and energy flexibility.
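The core idea of forced and delayed temporal flexibility can be illustrated with a simple storage energy balance: how long can the heat pump be forced on before the thermal storage is full, and how long can it be held off before the storage is empty? All numbers below are illustrative assumptions, not values from the system studied:

```python
# Toy energy-balance estimate of temporal flexibility for a heat pump with
# thermal storage. Storage size, state of charge, and power values are
# illustrative assumptions, not data from the studied system.
def forced_flexibility_h(storage_kwh: float, soc: float,
                         hp_thermal_kw: float, demand_kw: float) -> float:
    """Hours the heat pump can run at full power before the storage is full."""
    free_capacity = storage_kwh * (1.0 - soc)
    return free_capacity / (hp_thermal_kw - demand_kw)

def delayed_flexibility_h(storage_kwh: float, soc: float,
                          demand_kw: float) -> float:
    """Hours the heat pump can stay off before the storage is empty."""
    return storage_kwh * soc / demand_kw

print(f"forced:  {forced_flexibility_h(10.0, 0.6, 8.0, 2.0):.2f} h")
print(f"delayed: {delayed_flexibility_h(10.0, 0.6, 2.0):.2f} h")
```

The method described in the abstract refines such balances with process knowledge and manufacturer data, and adds domestic hot water and space heating demands as constraints.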
A novel calorimetric technique for the analysis of gas-releasing endothermic dissociation reactions
(2020)
In engineering design, optimization methods are frequently used to improve the initial design of a product. However, selecting an appropriate method is challenging because many methods exist, especially in the case of simulation-based optimization. This paper proposes a systematic procedure to support this selection process. Building upon quality function deployment, end-user and design use case requirements can be systematically taken into account via a decision matrix. The design and construction of the decision matrix are explained in detail. The proposed procedure is validated on two engineering optimization problems arising in the design of box-type boom cranes. For each problem, the problem statement and the applied optimization methods are explained in detail. The results obtained validate the use of optimization approaches within the design process, and the application of the decision matrix shows the successful incorporation of customer requirements into the algorithm selection.
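The mechanics of such a decision matrix can be sketched in a few lines: requirements are weighted, each candidate method is scored per requirement, and the weighted sum ranks the candidates. Requirement names, weights, and scores below are illustrative, not the ones from the paper:

```python
# Minimal weighted decision matrix in the spirit of the proposed procedure.
# Requirements, weights, candidate methods, and scores are illustrative.
requirements = {"handles simulation noise": 0.5,
                "few function evaluations": 0.3,
                "easy to parameterize":     0.2}

# score of each candidate method per requirement (0 = poor, 3 = good)
methods = {
    "genetic algorithm": {"handles simulation noise": 3,
                          "few function evaluations": 1,
                          "easy to parameterize":     2},
    "gradient descent":  {"handles simulation noise": 1,
                          "few function evaluations": 3,
                          "easy to parameterize":     3},
}

def weighted_score(scores):
    """Sum of requirement weight times score for one candidate method."""
    return sum(requirements[r] * s for r, s in scores.items())

ranking = sorted(methods, key=lambda m: weighted_score(methods[m]), reverse=True)
for m in ranking:
    print(f"{m}: {weighted_score(methods[m]):.2f}")
```

Quality function deployment adds structure on top of this core: the weights are derived from customer requirements rather than set directly.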
Purpose: The purpose of this qualitative phenomenological study is to explore the lived experiences of self-initiated expatriates (SIEs) prior to and during acculturation to life in a smaller peripheral region such as Vorarlberg, Austria. By providing insights into their lived experience, this research aims to fill gaps in the available information on motivators, success factors for adjustment, issues, stressors, and more that SIEs experience when adjusting; specifically, which factors promote adjustment and which hinder it.
Findings: The study developed a better understanding of how and which motivational factors lead to expatriation, and showed that opportunities often arise by chance. During acculturation, language factors (dialect) and cultural differences act as stressors, while social support, organizational support, and learning the language act as promoters of acculturation.
Further research could include other ethnicities, SIEs moving from developed to developing countries, and adjustment in regions with a dialect versus those without one.
Key words: self-initiated expatriates, expatriation, acculturation, adjustment, promoting acculturation, hindering acculturation.
The rapid change to remote work at the beginning of the Covid-19 pandemic led many organizations to roll out new collaboration platforms and rapidly digitalize their workflows and processes in order to continue operating. This sudden shift revealed to employees the potential benefits of working remotely, in the form of additional flexibility, and also showed the challenges and barriers organizations can face when introducing such a strategy. This thesis aims to uncover the key considerations that organizations in Vorarlberg's industrial sector need to take into account when establishing a remote work strategy. According to the results of the research, the Covid-19 pandemic was a paradigm shift for the interviewed decision makers in how they thought about remote work and how they transformed their respective organizations to continue operating. After the initial phase of Covid-19 restrictions, organizations started to experiment with remote work strategies of their own, based on their past experiences. By now, most of the interviewed organizations already use different remote work concepts and are evaluating which one best suits their needs. The main reasons why an organization introduces a remote work strategy are to be an attractive employer and to stay ahead in the search for new talent. Furthermore, when introducing a remote work strategy, organizations need to change their rules of collaboration, adapt their core values to fit a remote workplace, and introduce collaboration platforms designed to support a remote workforce.
Creating a schedule for performing actions in a real-world environment typically involves multiple types of uncertainty. To create a plan that is robust against uncertainties, it must stay flexible while attempting to be reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive: the planning algorithms must be able to re-schedule planned actions and adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
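The notion of "few disruptions" can be illustrated with a toy re-scheduler for a single resource: when one task is delayed, later tasks are pushed back only as far as needed, so tasks with slack stay untouched. The task data and the shifting rule are illustrative, not the method of the thesis:

```python
# Toy illustration of adaptive re-scheduling on a single resource:
# delay one task and repair the plan by shifting only overlapping successors.
def reschedule(tasks, delayed_name, delay):
    """tasks: list of (name, start, duration) sorted by start time.
    Returns a repaired plan where no two tasks overlap."""
    new_tasks = []
    prev_end = 0
    for name, start, dur in tasks:
        if name == delayed_name:
            start += delay                 # apply the disturbance
        start = max(start, prev_end)       # push back only if overlapping
        new_tasks.append((name, start, dur))
        prev_end = start + dur
    return new_tasks

plan = [("A", 0, 2), ("B", 2, 1), ("C", 4, 2)]
# Delaying A by 1 pushes B back, but C keeps its slot thanks to its slack.
print(reschedule(plan, "A", 1))
```

Real planners replace this greedy repair with search over alternative orderings, but the goal is the same: absorb a disturbance locally instead of rebuilding the whole plan.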