Refine
Year of publication
- 2023 (80)
Document Type
- Article (29)
- Master's Thesis (24)
- Conference Proceeding (17)
- Part of a Book (3)
- Preprint (3)
- Book (1)
- Doctoral Thesis (1)
- Other (1)
- Working Paper (1)
Institute
- Forschungszentrum Mikrotechnik (16)
- Forschungszentrum Business Informatics (11)
- Wirtschaft (9)
- Forschungszentrum Energie (8)
- Forschungszentrum Human Centred Technologies (7)
- Josef Ressel Zentrum für Intelligente Thermische Energiesysteme (4)
- Soziales & Gesundheit (4)
- Forschung (3)
- Technik | Engineering & Technology (3)
- Josef Ressel Zentrum für Robuste Entscheidungen (2)
Language
- English (80)
Keywords
- leadership (3)
- Machine learning (2)
- SME (2)
- agile software development (2)
- cloud computing (2)
- photonics (2)
- process management (2)
- software infrastructure (2)
- software process improvement (2)
- 3D MMI splitter (1)
In this paper, a 256-channel, 10-GHz arrayed waveguide grating (AWG) demultiplexer for ultra-dense wavelength division multiplexing was designed using an in-house developed tool called AWG-Parameters. The AWG demultiplexer was designed for a central wavelength of 1550 nm, and the structure was simulated in the PHASAR tool from Optiwave. Two different AWG designs were developed, and the influence of the design parameters on AWG performance was studied.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid-fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R² = 0.95 and a mean absolute error of MAE = 1.2 t/h are reached, at an average steam production of 35.1 t/h. On average, using a dynamic dataset for training lowers the MAE by 32% compared to a static dataset. A feature set based on expert knowledge lowers the MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When using a dynamic dataset, the performance parameters of thermal power plants are predicted with high accuracy, allowing short-term problems to be detected.
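The evaluation metrics reported above can be sketched in a few lines. This is an illustrative example, not the authors' code; the measured and predicted values are invented toy numbers in t/h.

```python
# Illustrative sketch (not the paper's code): how MAE and R^2 are
# computed for a steam-production prediction. Data values are made up.

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

measured  = [33.0, 36.5, 35.2, 34.8, 36.0]   # hypothetical steam production, t/h
predicted = [32.1, 37.0, 34.9, 35.6, 35.4]

print(round(mae(measured, predicted), 2))
print(round(r_squared(measured, predicted), 3))
```

An R² close to 1 and a small MAE relative to the average production (35.1 t/h in the paper) together indicate a usable prediction model.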
Purpose – The purpose of this study is to explore the exogenous and endogenous drivers of the high-growth of Unicorn start-ups along their life cycle, with a particular focus on Unicorns in the fintech industry.
Design/methodology/approach – The study employs an explorative longitudinal analysis of a matched pair of two Unicorn start-ups with similar antecedent features to understand drivers holistically over the longer term.
Findings – High-growth patterns over the longer term result from a combined industry- and company-life-cycle perspective. Drivers and growth patterns vary significantly according to the time of entry into the industry and its development status. The findings are systematised into a set of propositions to be tested in future research.
Research limitations/implications – The limitations lie in the empirical evidence, as the analysis is restricted to one matched pair. The revealed drivers of Unicorns' long-term growth may encourage future research to investigate these drivers on a larger scale.
Practical implications – The study offers practical recommendations for start-ups with high-growth ambitions and advice to policy makers regarding the development of tailor-made support programs.
Originality/value – The study significantly extends extant work on growth and high-growth by examining endogenous and exogenous triggers over time and by linking the Unicorn-life cycle to the industry life cycle, an approach which has, to the best of the authors’ knowledge, not yet been applied.
Synthetic polymers, such as polyamide (PA), inherently possess a moderate number of surface functionalities compared to natural polymers, which negatively impacts the uniformity of metallic coatings obtained through wet-chemical methods like electroless plating. The paper presents the use of a siloxane interlayer, formed from the condensation of the hydrolyzed 3-triethoxysilylpropyl succinic anhydride (TESPSA) precursor, as a strategy to modify the surface properties of polyamide 6.6 (PA66) fabrics and improve the uniformity of the copper surface coating. The application of the siloxane intermediate coating demonstrates a significant improvement in electrical conductivity, up to 20 times higher than in fabrics without the interlayer. The morphology of the coatings was investigated using scanning electron microscopy (SEM) and confocal laser scanning microscopy (LSM). In addition, dye adsorption, flexural rigidity, air permeability, and contact angle measurements were conducted to monitor the change in PA66 properties after the siloxane functionalization.
The rapid shift to remote work at the beginning of the Covid-19 pandemic led many organizations to roll out new collaboration platforms and rapidly digitalize their workflows and processes in order to continue operating. This sudden shift revealed to employees the potential benefits of working remotely, in the form of additional flexibility, and also exposed the challenges and barriers organizations could face when introducing such a strategy. This thesis aims to uncover the key considerations that organizations in Vorarlberg's industrial sector need to take into account when establishing a remote work strategy. According to the research results, the Covid-19 pandemic was a paradigm shift for the interviewed decision makers in how they thought about remote work and how they transformed their respective organizations to continue operating. After the initial phase of Covid-19 restrictions, organizations started to experiment with remote work strategies of their own, based on their past experiences. Most of the interviewed organizations now use different remote work concepts and are evaluating which one best suits their needs. The main reasons why an organization introduces a remote work strategy are to be an attractive employer and to stay ahead in the search for new talent. Furthermore, when introducing a remote work strategy, organizations need to change their rules of collaboration, adapt their core values to fit a remote workplace, and introduce collaboration platforms designed to support a remote workforce.
Creating a schedule to perform certain actions in a real-world environment typically involves multiple types of uncertainty. A plan that is robust to uncertainty must stay flexible while remaining reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive: the planning algorithms must be able to re-schedule planned actions and adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
Scrum has been a prominent project management framework for managing software development projects. The scrum team embodies values such as commitment, focus, respect, courage, and openness to develop trust, which serves as the foundation of the scrum framework. However, in recent years, scrum teams have been shifting towards a work-from-home environment, which is relatively new to most of them and known to present various challenges. Given the benefits of adhering to scrum values, this study aims to investigate the challenges scrum teams experience in adhering to scrum values while operating virtually, and to explore practical strategies to overcome the identified challenges, particularly during the storming stage of team development. The research employed a qualitative methodology using semi-structured interviews with scrum team members who have experience working in a virtual environment. Through qualitative content analysis of the interviews, the research identifies significant challenges within five main categories: communication, collaboration, interpersonal dynamics, the virtual work environment, and personal workspace issues. Beyond the challenges, the study also reveals practical strategies for successful team dynamics and higher efficiency. The strategies derived from team members' experiences fall into six categories: enhanced meeting management, leveraging in-person engagements, optimizing tools and technology, effective communication strategies, team building, and nurturing a positive work culture.
Alleviating the curse of dimensionality in Minkowski sum approximations of storage flexibility
(2023)
Many real-world applications require the joint optimization of a large number of flexible devices over some time horizon. The flexibility of multiple batteries, thermostatically controlled loads, or electric vehicles, for example, can be used to support grid operations and to reduce operating costs. Using piecewise constant power values, the flexibility of each device over d time periods can be described as a polytopic subset of power space. The aggregated flexibility is given by the Minkowski sum of these polytopes. As the computation of Minkowski sums is in general demanding, several approximations have been proposed in the literature. Yet, their application potential is often objective-dependent and limited by the curse of dimensionality. In this paper, we show that up to 2d vertices of each polytope can be computed efficiently and that the convex hull of their sums provides a computationally efficient inner approximation of the Minkowski sum. Via an extensive simulation study, we illustrate that our approach outperforms ten state-of-the-art inner approximations in terms of computational complexity and accuracy for different objectives. Moreover, we propose an efficient disaggregation method applicable to any vertex-based approximation. The proposed methods provide an efficient means to aggregate and disaggregate typical battery storage units in quarter-hourly periods over an entire day with reasonable accuracy for aggregated cost and for peak power optimization.
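The vertex-based idea in the abstract can be sketched for d = 2 time periods: for each of the 2d axis directions ±e_i, pick the vertex of every device polytope that supports that direction, sum these vertices across devices, and take the resulting points as an inner approximation of the Minkowski sum (every summed vertex is by construction a feasible aggregate point). This is an assumed, simplified reading of the method, and the device polytopes below are toy vertex lists, not real battery models.

```python
# Hedged sketch of a vertex-based inner approximation of a Minkowski
# sum for d = 2 periods. Device polytopes are given as toy vertex lists.

def support_vertex(vertices, g):
    """Vertex with maximal inner product with direction g
    (lexicographic tie-break so the result is deterministic)."""
    return max(vertices, key=lambda v: (v[0] * g[0] + v[1] * g[1], v))

def summed_support_points(devices):
    """For each of the 2d = 4 directions +/-e_i, sum the supporting
    vertices over all devices; each sum lies in the Minkowski sum."""
    directions = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    points = []
    for g in directions:
        s = [0.0, 0.0]
        for verts in devices:
            v = support_vertex(verts, g)
            s[0] += v[0]
            s[1] += v[1]
        points.append(tuple(s))
    return points

# Two hypothetical flexibility polytopes (power over two periods).
devices = [
    [(0, 0), (2, 0), (0, 2)],            # device A: a triangle
    [(0, 0), (1, 0), (1, 1), (0, 1)],    # device B: a unit box
]
print(summed_support_points(devices))
```

The convex hull of the returned points is the inner approximation; only per-device support computations are needed, avoiding the exponential cost of an exact Minkowski sum.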
Vast amounts of oily wastewater are byproducts of the petrochemical and shipping industries and, to this day, are frequently discharged into water bodies either without treatment or after insufficient treatment. To alleviate the resulting pollution, water treatment processes are in great demand. Bubble column humidifiers (BCHs), as part of humidification–dehumidification systems, are predestined for such a task, since they are insensitive to different feed liquids, simple in design, and have low maintenance requirements. While humidification in a bubble column has been investigated extensively for desalination, a systematic investigation of oily wastewater treatment is missing from the literature. We fill this gap by experimentally analyzing the treatment of an oil–water emulsion to derive recommendations for the future design and operation of BCHs. Our humidity measurements indicate that the air stream is always saturated after humidification for a liquid height of only 10 cm. A residual water mass fraction of 3.5 wt% is measured after a batch run of six hours. Furthermore, continuous measurements show that an increase in oil mass fraction leads to a decrease in system productivity, especially at high oil mass fractions. This decrease is caused by the heterogeneity of the liquid temperature profile. A lower liquid height mitigates this heterogeneity, thereby decreasing the heat demand and improving the overall efficiency. The oil content of the produced condensate is below 15 ppm, allowing discharge into various water bodies. The results of our systematic investigation prove suitability and indicate a strong future potential for the use of BCHs in oily wastewater treatment.
Activating heat pump flexibility is a viable way to support grid balancing via demand-side management measures and to meet the need for flexibility options. Aggregators, as the interface between prosumers, distribution system operators, and balance responsible parties, face the challenge of transforming prosumer information into aggregated available flexibility that can be traded, under data privacy and technical restrictions. The literature lacks a generic, applicable, and widely accepted flexibility estimation method for heat pumps that works with reduced sensor and system information and accounts for system- and demand-dependent behaviour. In this paper, we adapt and extend a method from the literature by incorporating domain knowledge to overcome reduced sensor and system information. We apply data from five real-world heat pump systems, distinguish operation modes, estimate the power and energy flexibility of each individual heat pump system, prove the transferability of the method, and aggregate the available flexibilities to showcase a small heat pump (HP) pool as a proof of concept.
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes, often because they aim to start with complicated and time-intensive machine learning scenarios. This work evaluates Process Capability Analysis (PCA) as a pragmatic, easy, and quick way of leveraging the machine data gathered from the production process. The area of application considered is injection molding. After describing the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA yields multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing whole process data instead of considering only selected production samples. The use of machine data enables additional insights into process stability and the associated product quality.
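The capability indicators such an analysis builds on can be sketched as follows. This is a generic illustration of the standard Cp/Cpk formulas, not the paper's implementation; the specification limits and shot weights are invented toy values.

```python
# Hedged sketch of process capability indices Cp and Cpk.
# Specification limits and measurements below are illustrative only.
from statistics import mean, stdev

def cp(values, lsl, usl):
    """Process capability: specification width over 6-sigma spread."""
    return (usl - lsl) / (6 * stdev(values))

def cpk(values, lsl, usl):
    """Capability index accounting for the centering of the process."""
    m, s = mean(values), stdev(values)
    return min(usl - m, m - lsl) / (3 * s)

# Hypothetical shot weights (g) from an injection-molding process,
# with lower/upper specification limits of 9.0 g and 11.0 g.
weights = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
print(round(cp(weights, 9.0, 11.0), 2))
print(round(cpk(weights, 9.0, 11.0), 2))
```

Values well above 1.33 are conventionally read as a capable process; tracking these indices continuously over all produced parts is exactly the kind of lightweight monitoring the abstract argues for.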
Demand-side management approaches that exploit the temporal flexibility of electric vehicles have attracted much attention in recent years due to their increasing market penetration. These measures help alleviate the burden on the power system, especially in distribution grids, where bottlenecks are more prevalent. Electric vehicles are an attractive asset for distribution system operators, with the potential to provide grid services if properly managed. In this thesis, a systematic investigation is first conducted for two demand-side management methods typically reported in the literature: a voltage-droop-control-based approach and a market-driven approach. Then, a control scheme for decentralized autonomous demand-side management of electric vehicle charging scheduling is proposed, which relies on a unidirectionally communicated grid-induced signal. In all the topics considered, the implications for distribution grid operation are evaluated using a set of time-series load-flow simulations performed for representative Austrian distribution grids. Droop control mechanisms, which require no communication, are discussed for electric vehicle charging control. The method provides an economically viable solution at all penetrations if electric vehicles charge at low nominal power rates. However, given current market trends in residential charging equipment, especially in the European context, where most charging equipment is designed for 11 kW, the long-run technical feasibility of the method is debatable. As electricity demand strongly correlates with energy prices, a linear optimization algorithm is proposed to minimize charging costs, using next-day market prices as the grid-induced incentive function under the assumption of perfect user predictions. The constraints on the state of charge guarantee that the energy required for driving is delivered without failure. An average energy cost saving of 30% is realized at all penetrations. Nevertheless, the avalanche effect due to simultaneous charging during low-price periods introduces new power peaks exceeding those of uncontrolled charging, which obstructs the grid-friendly integration of electric vehicles.
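The price-driven scheduling idea can be sketched as a single-vehicle simplification: fill the cheapest hours first, subject to a maximum charging power, until the required energy is delivered. This greedy is optimal for this simple relaxation of the linear program; the prices and vehicle parameters below are illustrative, not from the thesis.

```python
# Simplified sketch of price-driven EV charging: schedule the required
# energy into the cheapest hours under a power limit. Toy values only.

def cheapest_schedule(prices, energy_kwh, max_power_kw):
    """Per-hour charging powers (kW) minimizing energy cost;
    with 1-hour slots, kW and kWh coincide numerically."""
    schedule = [0.0] * len(prices)
    remaining = energy_kwh
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if remaining <= 0:
            break
        p = min(max_power_kw, remaining)
        schedule[hour] = p
        remaining -= p
    return schedule

prices = [0.30, 0.12, 0.10, 0.25, 0.18, 0.40]   # EUR/kWh, hypothetical
schedule = cheapest_schedule(prices, energy_kwh=20.0, max_power_kw=11.0)
cost = sum(p * q for p, q in zip(prices, schedule))
print(schedule)
print(round(cost, 2))
```

Note how all charging lands in the two cheapest hours: when every vehicle follows the same price signal, this is precisely the avalanche effect the thesis reports, creating new peaks in the low-price periods.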
Beyond the Four-Level Model: Dark and Hot States in Quantum Dots Degrade Photonic Entanglement
(2023)
Entangled photon pairs are essential for a multitude of quantum photonic applications. To date, the best performing solid-state quantum emitters of entangled photons are semiconductor quantum dots operated around liquid-helium temperatures. To favor the widespread deployment of these sources, it is important to explore and understand their behavior at temperatures accessible with compact Stirling coolers. Here we study the polarization entanglement among photon pairs from the biexciton–exciton cascade in GaAs quantum dots at temperatures up to ∼65 K. We observe entanglement degradation accompanied by changes in decay dynamics, which we ascribe to thermal population and depopulation of hot and dark states in addition to the four levels relevant for photon pair generation. Detailed calculations considering the presence and characteristics of the additional states and phonon-assisted transitions support the interpretation. We expect these results to guide the optimization of quantum dots as sources of highly entangled photons at elevated temperatures.
Pooled data from published reports on infants with clinically diagnosed vitamin B12 (B12) deficiency were analyzed with the purpose of describing the presentation, diagnostic approaches, and risk factors for the condition to inform prevention strategies. An electronic (PubMed database) and manual literature search following the PRISMA approach was conducted (preregistration with the Open Science Framework, accessed on 15 February 2023). Data were described and analyzed using correlation analyses, chi-square tests, ANOVAs, and regression analyses; 102 publications (292 cases) were included. The mean age at first symptoms (anemia, various neurological symptoms) was four months; the mean time to diagnosis was 2.6 months. Maternal B12 at diagnosis, exclusive breastfeeding, and a maternal diet low in B12 predicted infant B12, methylmalonic acid, and total homocysteine. Infant B12 deficiency is still not easily diagnosed. Methylmalonic acid and total homocysteine are useful diagnostic parameters in addition to B12 levels. Since maternal B12 status predicts infant B12 status, it would probably be advantageous to target women in early pregnancy, or even preconceptionally, to prevent infant B12 deficiency, rather than to rely on newborn screening, which often does not reliably identify high-risk children.
Digitalization is changing business models and operational processes. At the same time, improved data availability and powerful analytical methods are influencing controlling and increasingly require the use of statistical and information technology skills and knowledge. Using a case study from marketing controlling, the article shows the use of business analytics methods and addresses the tasks of controlling in the digital age.
Whether at the intramolecular or cellular scale, cell-cell adhesions in organisms adapt to external mechanical cues arising from the static environment of cells and from dynamic interactions between neighboring cells. Cell-cell adhesions need to resist detachment forces to secure the integrity and internal organization of organisms. In the past, various techniques have been developed to characterize the adhesion properties of molecules and cells in vitro and to understand how cells sense and probe their environment. Atomic force microscopy and dual-pipette aspiration, where cells are mainly in suspension, are common methods for studying the detachment forces of cell-cell adhesions. How cell-cell adhesion forces develop in adherent, environment-adapted cells, however, is less clear. Here, we designed the Cell-Cell Separation Device (CC-SD), a microstructured substrate that measures both intercellular forces and external stresses of cells towards the matrix. The design is based on micropillar arrays originally developed for cell traction-force measurements. We designed PDMS micropillar blocks to which cells could adhere and connect to each other across the gap. Controlled stretching of the whole substrate changed the distance between the blocks and increased the gap size, allowing us to apply strains to cell-cell contacts, eventually leading to cell-cell adhesion detachment, which was measured by pillar deflections. The CC-SD provided an increase in the gap between the blocks of up to 2.4-fold, sufficient to separate substrate-attached cells with a fully developed F-actin network. Simultaneously measured pillar deflections allowed us to quantify the cellular response to the applied intercellular strain. The CC-SD thus opens up possibilities for the analysis of intercellular detachment forces and sheds light on the robustness of cell-cell adhesions in dynamic processes in tissue development.
The design, simulation, and optimization of a 1×4 optical three-dimensional multimode interference splitter using IP-Dip polymer as the core and polydimethylsiloxane (PDMS) Sylgard 184 as the cladding is demonstrated. The splitter was simulated using the beam propagation method in the BeamPROP simulation module of the RSoft photonic tool and optimized for an operating wavelength of 1.55 μm. Based on the minimum insertion loss, the dimensions of the splitter were optimized for a waveguide with a core size of 4×4 μm². The objective of the study is to create a design suitable for fabrication by three-dimensional direct laser writing optical lithography.
We present the design of a planar 16-channel, 100-GHz multi-mode polymer-based AWG. The AWG was designed for a central wavelength of 1550 nm using the AWG-Parameters tool. The AWG structure was created and simulated in the commercial photonic tool PHASAR from Optiwave, and the achieved transmission characteristics were evaluated with the AWG-Analyzer tool. For the design, multi-mode waveguides with a cross-section of (4×4) µm² were used. The simulated results show a strong worsening of the transmission characteristics compared with designs using single-mode waveguides; nevertheless, the transmitting channels are clearly separated. The reason for using thicker multi-mode waveguides in the design is the possibility of fabricating the AWG structure on a polymer basis using direct laser writing lithography.
Having autonomy in the workplace can have a positive impact on employees' performance, which in turn can benefit an organization's competitive advantage. While previous research has primarily focused on the psychological effects of job autonomy on employee performance and has been limited to certain domains, the relationship between job autonomy and organizational design is an important area of study for organizations seeking to improve their competitiveness. This thesis proposes a conceptual model for designing an organizational structure that promotes employee performance in manufacturing companies by removing obstacles to obtaining job autonomy. The focus is on ambitious employees who seek growth and development opportunities within their organization. The model is based on a review of the existing literature on job autonomy and organizational design. Exploratory qualitative research was conducted with selected ambitious employees from different industries by means of one-on-one semi-structured interviews. Overall, the proposed model has practical implications for manufacturing companies looking to motivate their employees, as well as for researchers seeking to advance their understanding of contemporary organizational design.
Digital twin as enabler of business model innovation for infrastructure construction projects
(2023)
Emerging technologies and methods are becoming an important element of the construction industry. Digital Twins are used as a base for storing the data held in BIM models, for making use of that data, and for making it visible. Transparency in all phases of the lifecycle of building and infrastructure assets is crucial for a more efficient lifecycle of planning, construction, and maintenance. Whereas other industries have increased performance in these phases by making use of their data, the construction industry remains stuck in traditional methods and business models. In this paper we propose a concept that focuses on the digital production twin. Comparing planning data with as-is production data can empower a data-driven continuous improvement process and support decision making about future innovations and suitable business models. This paper outlines the possibility of using the data stored in a digital twin to evaluate possible business models.
A lack of transparency and traceability of products and their raw materials means that most products can only be thrown away or improperly recycled due to missing data. This conflicts with circular economy principles, which are demanded by several initiatives, including the European Union. The aim of this master thesis is to analyze this conflict and to propose a technical solution based on Distributed Ledger Technology (DLT) that enables transparency and traceability of products and their materials. The thesis therefore addresses two central research questions: 1. How can traceability and transparency be enabled by integrating a DLT solution? 2. What would a prototype integrating smart contracts and DLT look like? To answer these questions, a blockchain solution is implemented using Hyperledger Fabric. The solution uses the immutability and decentralized nature of DLT to record and track the movement of products and their materials throughout their life cycle in the Circular Economy. Furthermore, private data collections grant confidentiality and privacy while ensuring transparency. The thesis contributes to the Circular Economy field by exploring the principles, models, and challenges of the Circular Economy and the circularity goals of a Digital Product Passport in order to develop a suitable technical solution. The chosen blockchain framework, Hyperledger Fabric, is presented, and its key components and features are highlighted. The thesis also delves into the design decisions and considerations behind the Digital Product Passport platform, explaining the architecture and transaction flow, together with the prototype implementation and a demonstration showcasing the functionality of the solution. Results and analysis provide insights into the challenges of the Circular Economy, sustainable resource management, and the Digital Product Passport, resulting in recommendations for future improvements and enhancements.
Overall, this thesis offers a practical solution utilizing DLT to enable transparency and traceability in the Circular Economy, contributing to the realization of sustainable and efficient resource management practices to ultimately contribute to the set Circular Economy initiatives.
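The thesis builds on Hyperledger Fabric, but the core traceability idea, an append-only, tamper-evident record of a product's material history, can be illustrated with a hash-chained log in plain Python. This is a conceptual sketch only, not Fabric chaincode; the event fields and product identifiers are invented.

```python
# Conceptual sketch (not Hyperledger Fabric code): a hash-chained,
# append-only "product passport" whose tampering can be detected.
import hashlib
import json

def add_event(chain, event):
    """Append an event, linking it to the hash of the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({"event": event, "prev": prev_hash, "hash": digest})

def verify(chain):
    """Recompute every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        body = {"event": entry["event"], "prev": prev}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

passport = []
add_event(passport, {"product": "P-001", "step": "raw material sourced", "material": "aluminium"})
add_event(passport, {"product": "P-001", "step": "assembled"})
add_event(passport, {"product": "P-001", "step": "recycled"})
print(verify(passport))                      # intact chain verifies
passport[1]["event"]["step"] = "discarded"   # tampering...
print(verify(passport))                      # ...is detected
```

In the actual thesis, the ledger, consensus, and access control are provided by Fabric and its private data collections; this sketch only shows why an immutable event history makes a product's life cycle auditable.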
A step change is needed in the deployment of renewable energy if the triple challenge of ensuring climate change mitigation, energy security, and energy affordability is to be met. Yet, social acceptance of infrastructure projects and policies remains a key concern. While there have been decades of fruitful research on the social acceptance of wind energy and other renewables, much of the extant research is cross-sectional in nature, failing to capture the important dynamic processes that can make or break renewable energy projects. This paper introduces a Special Issue of Energy Policy which focuses on the neglected topic of the dynamics of social acceptance of renewable energy, drawing on contributions made at an international research conference held in St. Gallen (Switzerland) in June 2022. In addition to introducing these papers and drawing out common themes, we also seek to offer some conceptual clarity on the issue of dynamics in social acceptance, taking into account the influence of time, power, and scale in shaping decision-making processes. We conclude by highlighting a number of avenues of potential future research.
Background: Cardiovascular disease is the major cause of death worldwide. Although knowledge regarding diagnosing and treating cardiovascular disease has increased dramatically, secondary prevention remains insufficiently implemented due to failure among affected individuals to adhere to guideline recommendations. This has continued to lead to high morbidity and mortality rates. Involving patients in their healthcare and facilitating their active roles in their chronic disease management is an opportunity to meet the needs of the increasing number of cardiovascular patients. However, simply recalling advice regarding a more preventive lifestyle does not produce sustainable behavioral lifestyle changes. We investigate the effect of plaque visualization combined with low-threshold daily lifestyle tasks using the smartphone app PreventiPlaque to evaluate change in cardiovascular risk profile. Methods and study design: This randomized, controlled clinical trial includes 240 participants with ultrasound evidence of atherosclerotic plaque in one or both carotid arteries, defined as focal thickening of the vessel wall measuring 50% more than the regular vessel wall. A criterion for participation is access to a smartphone suitable for app usage. The participants are randomly assigned to an intervention or a control group. While both groups receive the standard of care, the intervention group has additional access to the PreventiPlaque app during the 12-month follow-up. The app includes daily tasks that promote a healthier lifestyle in the areas of smoking cessation, medication adherence, physical activity, and diet. The impact of plaque visualization and app use on the change in cardiovascular risk profile is assessed by SCORE2. The feasibility and effectiveness of the PreventiPlaque app are evaluated using standardized and validated measures for patient feedback.
Effective lead management
(2023)
In the last few years, global interest in lead management has increased. This classic topic for marketing and sales departments is aimed at converting potential customers into sales. The following thesis identifies the challenges and solutions for marketing and sales departments in implementing effective lead management. Using data from a literature review and qualitative empirical research conducted with representatives of marketing and sales departments, the results reveal both general and task-specific challenges and solutions. The research indicates that general challenges and solutions concern the gap between marketing and sales, new processes, and data management, including data quality, software, and silos. In addition, task-specific challenges and solutions were identified concerning lead generation (including purchased leads), lead qualification, and lead nurturing, as well as sales-specific challenges and solutions concerning the focus on existing customers, time famine, and lead routing. This thesis provides a framework for further studies on the challenges and solutions for marketing and sales departments implementing lead management.
The Fast Average Current Mode control methodology is a novel method for implementing a current compensator in a switched-mode power supply. It does not require compensation against sub-harmonic instability and is inductor-independent. In this work, the digital implementation of this topology is compared against an analog implementation using simulation. Additionally, a hardware prototype is created to validate the digital simulation's results. In a Simulink environment, parameters of the digital implementation, such as the digital-to-analog converter resolutions and the delay counter frequency, are varied to study their impact on system performance. The simulations show that a digital current compensator has performance similar to an analog implementation when tailored to the application. When evaluating the whole control loop, the digital system is inferior due to added delays caused by digital-to-analog conversion. By operating the buck converter hardware implementation as a current source, the functionality of the current mode control implementation in an FPGA was proven. Voltage control could not be validated due to hardware issues; however, since the source code was successfully simulated with a mixed-signal model of the converter, it can be assumed to be functional. Apart from performance, a digital implementation offers many benefits compared to an analog solution, such as configurability of control parameters and easy compensation of component variation and aging.
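To illustrate what a digital current compensator does in principle, the sketch below simulates a generic discrete PI average-current loop around an ideal buck converter stage. All component values and gains are assumed for illustration; this is not the Fast Average Current Mode algorithm from the thesis.

```python
# Generic discrete PI current compensator on an averaged buck-converter model.
# All values (L, R, Vin, Ts, gains) are assumed, not taken from the thesis.

L = 100e-6      # inductor [H] (assumed)
R = 0.5         # series resistance [ohm] (assumed)
Vin = 24.0      # input voltage [V] (assumed)
Ts = 1e-6       # control/sample period [s] (assumed)
Kp, Ki = 0.05, 200.0   # PI gains (assumed, not tuned to real hardware)

def simulate(i_ref, steps=5000):
    """Track a current reference with a discrete PI loop, forward-Euler plant."""
    i, integ = 0.0, 0.0
    for _ in range(steps):
        err = i_ref - i
        integ += err * Ts
        duty = min(max(Kp * err + Ki * integ, 0.0), 1.0)  # clamp duty cycle
        v_avg = duty * Vin                                # averaged switch-node voltage
        i += Ts * (v_avg - R * i) / L                     # di/dt = (v - R*i)/L
    return i

final_current = simulate(5.0)  # settles near the 5 A reference
```

The clamp on the duty cycle mirrors the hard limits a real modulator imposes; the averaged model deliberately ignores switching ripple, which is exactly the level of abstraction at which an average current mode loop operates.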
This thesis evaluates the feasibility of conducting visual inspection tests on power industry constructions using object detection techniques. The introduction provides an overview of this field’s state-of-the-art technologies and approaches. For the implementation, a case study is then conducted, which is done in collaboration with the partner company OMICRON Electronics GmbH, focusing on power transformers as an example. The objective is to develop an inspection test using photographs to identify power transformers and their subcomponents and detect existing rust spots and oil leaks within these components. Three object detection models are trained: one for power transformers and sub-components, one for rust detection, and one for oil leak detection. The training process utilizes the implementation of the YOLOv5 algorithm on a Linux-based workstation with an NVIDIA Quadro RTX 4000 GPU. The power transformer model is trained on a dataset provided by the partner company, while open-source datasets are used for rust and oil leak detection. The study highlights the need for a more powerful GPU to enhance training experiments and utilizes an Azure DevOps Pipeline to optimize the workflow. The performance of the power transformer detection model is satisfactory but influenced by image angles and an imbalance of certain sub-components in the dataset. Multi-angle video footage is a proposed solution for the inspection test to address this limitation and increase the size of the dataset, focusing on reducing the imbalance. The models trained on open-source datasets demonstrate the potential for rust and oil leak detection but lack accuracy due to their generic nature. Therefore, the datasets must be adjusted with case-specific data to achieve the desired accuracy for reliable visual inspection tests. The results of the case study have been well-received by the partner company’s management, indicating future development opportunities. 
This case study will likely serve as a foundation for implementing visual inspection tests as a product.
Hot water heat pumps are well suited for demand-side management, as the heat pump market has grown rapidly in recent years with the trend toward decentralized domestic hot water use. Sales were accelerated by demand for energy conservation and energy efficiency, as well as by less restrictive rules regarding Legionella. While in the literature the model predictive control potential of heat pumps is commonly shown in simulations, the share of experimental studies is relatively low. To this day, experimental studies considering solely domestic hot water use are not available. In this paper, the realistically achievable model predictive control potential of a hot water heat pump is compared to standard hysteresis control, to provide an experimental proof. We show for the first time how state-of-the-art approaches (model predictive control, system identification, live state estimation, and demand prediction) can be transferred from electric hot water heaters to hot water heat pumps, combined, and implemented in a real-world hot water heat pump setup. The optimization approach, embedded in a realistic experimental setting, leads to a decrease in electric energy demand and cost per unit electricity by approximately 12% and 14%, respectively. Further, an increase in efficiency of approximately 13% has been achieved.
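The core idea of model predictive control here can be sketched as a toy receding-horizon problem: choose the on/off schedule that is cheapest over a short horizon while keeping the tank temperature inside comfort bounds, then apply only the first action. All model parameters, prices, and draw profiles below are assumed for illustration and are not from the paper.

```python
from itertools import product

# Toy MPC for an on/off hot-water heat pump (all numbers assumed).
P_EL = 1.0          # electric power when on [kW] (assumed)
DT_ON = 4.0         # tank temperature rise per step when on [K] (assumed)
LOSS = 1.0          # standing loss per step [K] (assumed)
T_MIN, T_MAX = 45.0, 60.0   # comfort/safety bounds [degC] (assumed)

def step(temp, on, draw):
    """One-step tank model: heating minus standing loss minus draw-off."""
    return temp + (DT_ON if on else 0.0) - LOSS - draw

def mpc(temp, prices, draws):
    """Enumerate all on/off schedules; return the cheapest feasible first action."""
    best_cost, best_first = None, 0
    for plan in product((0, 1), repeat=len(prices)):
        t, cost, feasible = temp, 0.0, True
        for on, price, draw in zip(plan, prices, draws):
            t = step(t, on, draw)
            cost += on * P_EL * price
            if not (T_MIN <= t <= T_MAX):
                feasible = False
                break
        if feasible and (best_cost is None or cost < best_cost):
            best_cost, best_first = cost, plan[0]
    return best_first

# Cheap electricity now, expensive later, and a hot-water draw in step 4:
# the controller pre-heats during the cheap steps.
action = mpc(50.0, prices=[0.1, 0.1, 0.4, 0.4], draws=[0, 0, 0, 6.0])
```

A real implementation replaces the brute-force enumeration with a solver and the one-line tank model with the identified system model, but the structure (predict, optimize over a horizon, apply first action, repeat) is the same.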
Highly-sensitive single-step sensing of levodopa by swellable microneedle-mounted nanogap sensors
(2023)
Microneedle (MN) sensing of biomarkers in interstitial fluid (ISF) can overcome the challenges of self-diagnosis of diseases by a patient, such as blood sampling, handling, and measurement analysis. However, MN sensing technologies still suffer from poor measurement accuracy due to the small amount of target molecules present in ISF, and they require multiple steps of ISF extraction, isolation of ISF from the MN, and measurement with additional equipment. Here, we present a swellable MN-mounted nanogap sensor that can be inserted into the skin tissue, absorb ISF rapidly, and measure biomarkers in situ by amplifying the measurement signals by redox cycling in nanogap electrodes. We demonstrate that the MN-nanogap sensor measures levodopa (LDA), a medication for Parkinson's disease, down to 100 nM in an aqueous solution, and 1 μM in both a skin-mimicking gelatin phantom and porcine skin.
Immersive educational spaces
(2023)
"If only we had had such opportunities to grasp history like this when I was young" – words by an almost 80-year-old woman holding an iPad on which both, the buildings in the background and a tower in the form of a virtual 3D object, appear within reach. To "grasp" history - what an apt use of this action-oriented word for an augmented reality application built on considerations of thinking and acting in history. This telling image emerged during the first test run of the app i.appear which will be the focus of this article's considerations on the use of immersive learning environments. The application i.appear has been used in the city of Dornbirn (Austria) for a year now to teach historical content through location-based augmented reality and other interactive and multimedia technologies. After a brief description of the potential of such applications, the epistemological structure of the hosting app i.appear and its functionality will be outlined. This article will focus on the “Baroque Master Builders” tour of the hosting app that was created and tested as part of the current research.
In the era of digital transformation, an evolution is taking place: new perspectives on leadership are required, especially in virtual teams. Shared leadership is a promising leadership form to meet the challenges of a virtual team setting. In particular, studies show that shared leadership increases performance, team creativity, and innovative behavior. Moreover, responsibility is distributed among several individuals rather than one. Nevertheless, it is unclear which skills are needed in shared leadership teams and how they can be trained. Therefore, we develop a conceptual framework to pave the way for an empirical inquiry into the skills for and the role of shared leadership. Moreover, we encourage the discussion of whether current leadership development is still viable and offer practical implications for developing shared leadership.
In 2021, a prominent Austrian dairy producer suffered an IT attack and was completely paralysed. Without clearly defined mitigation measures in place, major disruptions were caused along the whole supply chain, including logistics service providers, governmental food safety bodies, as well as retailers (i.e., supermarkets and convenience stores). In this paper, we ask how digitisation and digital transformation impact IT security, especially considering the complex company ecosystems of food production and food supply chains in Austria. The problem statement stems from a gap in knowledge of key differences in approaches towards IT security, resilience, risk management, and especially business interfaces between food suppliers, supermarkets, distributors, logistics, and other service providers. To answer the related research questions, the authors first conduct literature research, highlight common guidelines and standardisation, and look at state-based recommendations for critical infrastructure. In a second step, the paper describes in detail a quantitative and qualitative survey with Austrian food companies (producers and retailers). A description of recommended measures for the industry, further steps, as well as an outlook conclude the paper.
International Entrepreneurship explains the opportunities and challenges facing internationalizing entrepreneurial ventures. The book includes a thorough discussion of fundamentals as well as contemporary research findings. Numerous cases, featuring diverse contexts, illustrate theory and help classroom use.
The implementation of direct-to-consumer (D2C) business models has become more important for companies trying to develop a competitive edge and improve consumer engagement in today's rapidly expanding e-commerce market. This master's thesis investigates the important success elements and problems of deploying D2C models in the e-commerce business. The research question focuses on identifying the factors that contribute to the successful transition to D2C models and the obstacles businesses encounter along the way. Through qualitative research using the Eisenhardt method and in-depth case studies with industry experts, this study provides valuable insights into key success factors for direct-to-consumer (D2C) business models in e-commerce. The findings highlight that businesses that effectively implement D2C models utilize key success factors such as a clear value proposition, customer engagement and relationship building, seamless online experiences, targeted marketing and digital advertising, brand identity and storytelling, and flexibility and adaptability. However, they also face challenges related to operational adjustments, marketing and branding investments, competition, and market saturation. Based on these research outcomes, this thesis provides recommendations for businesses seeking to switch to or implement D2C models in e-commerce. These recommendations emphasize embracing a customer-centric mindset, developing digital capabilities, fostering strong leadership commitment, leveraging data and analytics, establishing direct customer relationships, optimizing operational processes, building brand trust and credibility, and allocating resources wisely. This master's thesis provides a comprehensive analysis of the key success factors and challenges associated with the transition to or implementation of D2C business models in the e-commerce industry. It provides advice to help companies successfully transition to D2C models.
This study aims to address the research gap surrounding the role of leadership in the formation of high-performance teams within startup companies. While there is existing research on high-performing teams, limited attention has been given to leadership in this environment. To bridge this gap, the study combines a literature review and qualitative analysis through semi-structured interviews with diverse stakeholders in startups, with the goal of providing practical guidance for startup executives based on the research findings. The study uncovers key aspects of leadership in high-performance teams, emphasizing the importance of skills such as motivation and support for team members, fostering psychological safety and trust, and effectively managing uncertainty. In addition to resource constraints and high expectations, the study sheds light on the challenges faced by leaders in startup and high-performance team environments, particularly the blurring of traditional leadership roles as team members seek autonomy and decision-making authority. These findings present opportunities for future research to explore this progressive leadership style. Overall, this study contributes to our understanding of leadership dynamics within high-performance teams operating in the context of startups. It offers valuable insights that can help startup executives navigate the complexities of leadership and foster the development of successful and high-performing teams.
This thesis investigates the role of leadership behaviours of C-level executives in the context of post-M&A integration processes. The primary focus is on understanding the impact of specific leadership behaviours on inspiring desirable follower effects and facilitating emotional acceptance during organizational change. Drawing on the frameworks presented in the “Six-Dimension Integrative Model of Leadership” and "The Six Domains of Leadership" developed by Sitkin et al., the study conducts expert interviews with managers from middle management who have recently experienced M&A integration. The answers are analysed in depth to identify the most effective leadership behaviours, highlighting those mentioned most frequently and those capable of triggering multiple follower effects simultaneously. The result is a list of behaviours that can serve as a guideline for C-level executives who want to foster desirable follower effects throughout the M&A integration journey.
Although workplace climate has already been extensively studied, the research has not led to firm conclusions regarding leadership trainings referring to the awareness of psychological safety in a company and its influence on existing teams and the general work climate. The author used the existing model of Carr, Schmidt, Ford, & DeShon (2003) and extended it with psychological safety as a fourth climate item to develop hypotheses, which can also be seen as a path-analytic model. The model posits that climate affects individual-level outcomes through its impact on cognitive and affective states. Therefore, the author wants to show how the four higher-order facets of climate affect the individual-level outcomes of job performance, psychological well-being, and withdrawal through their impact on organizational commitment and job satisfaction (Carr, Schmidt, Ford, & DeShon, 2003).
In this work, parametric excitation is introduced in a fully balanced flexible rotor mounted on two identical active gas foil bearings. The active gas foil bearings change the top foil shape harmonically with a specific amplitude and frequency. The deformable foil shape is approximated by an analytical function, while the gas pressure distribution is evaluated by the numerical solution of the Reynolds equation for compressible flow. The harmonic variation of the foil shape generates a respective variation in the bearings' stiffness and damping properties, and the system experiences parametric resonances and antiresonances at specific excitation frequencies. The nonlinear gas bearing forces generate bifurcations in the solutions of the system at certain rotating speeds and excitation frequencies; period doubling and Neimark-Sacker bifurcations are noticed in the examined system, and their progress is evaluated as the two bifurcation parameters (rotating speed and parametric excitation frequency) are changed, through a codimension-2 numerical continuation of limit cycles. It is found that in a specific range of excitation frequency there are parametric anti-resonances, and the bifurcations collide and vanish. Therefore, a bifurcation-free operating range is established and the system can operate stably over a wide speed range.
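For reference, the isothermal compressible Reynolds equation that such gas foil bearing models solve is commonly written in the following dimensional form (film thickness h, pressure p, dynamic viscosity μ, surface speed U; the paper's exact formulation may differ, e.g. it is often nondimensionalized with the bearing number):

```latex
\frac{\partial}{\partial x}\!\left(p\,h^{3}\,\frac{\partial p}{\partial x}\right)
+ \frac{\partial}{\partial z}\!\left(p\,h^{3}\,\frac{\partial p}{\partial z}\right)
= 6\,\mu\,U\,\frac{\partial (p\,h)}{\partial x}
+ 12\,\mu\,\frac{\partial (p\,h)}{\partial t}
```

The parametric excitation described above enters through the time-varying film thickness h(x, z, t) imposed by the harmonically deformed top foil.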
This thesis focuses on implementing and testing communication over a private 5G standalone network in an industrial environment, with a specific emphasis on communication between two articulated robots. The main objective is to examine machine-to-machine communication behavior in various test scenarios. Initially, the 5G core and radio access network components are described, along with their associated interfaces, to establish foundational knowledge. Subsequently, a use case involving two articulated robots is implemented, and essential metrics are defined for testing, including round-trip time, packet and inter-packet delay, and packet error rate. The tests investigate the impact of 5G quality of service, packet size, and transmission interval on communication between the robots, focusing on the effects of network traffic. The results highlight the significance of prioritizing network resources based on the assigned quality of service identifier (5QI), demonstrate the influence of packet sizes on communication performance, and underscore the importance of transmission intervals for automation purposes. Additionally, the study examines how network disturbances influence the movements of a robot controlled via 5G, establishing a direct relationship between network metrics and the resulting deviations in the robot's trajectory. The work concludes that while machine-to-machine communication can be successfully implemented with 5G SA, tradeoffs must be carefully considered, especially concerning packet error rate, and emphasizes the importance of understanding the required resources before implementation to ensure feasibility. Future research directions include investigating network slicing, secure remote control of robots, and exploring the use of higher frequency bands. The study highlights the significance of aligning theoretical standards with practical implementation options in the evolving landscape of 5G networks.
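Of the metrics listed, round-trip time is the easiest to make concrete. The sketch below measures UDP round-trip time against a local echo server over loopback; the loopback link and echo pattern stand in for the 5G link and are assumptions for illustration, not the thesis test setup.

```python
import socket
import threading
import time

# Measure average UDP round-trip time via a loopback echo server.
# Loopback stands in for the 5G link; the setup is illustrative only.

def echo_server(sock, n):
    """Echo n datagrams back to their sender."""
    for _ in range(n):
        data, addr = sock.recvfrom(1024)
        sock.sendto(data, addr)

def measure_rtt(payload=b"x" * 64, samples=10):
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))          # OS picks a free port
    threading.Thread(target=echo_server,
                     args=(server, samples), daemon=True).start()

    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.settimeout(2.0)                  # guard against lost datagrams
    rtts = []
    for _ in range(samples):
        t0 = time.perf_counter()
        client.sendto(payload, server.getsockname())
        client.recvfrom(1024)
        rtts.append(time.perf_counter() - t0)
    return sum(rtts) / len(rtts)            # average RTT in seconds

avg = measure_rtt()
```

In a real measurement campaign, the same send/receive timestamping would run across the radio link, with payload size and send interval swept to reproduce the packet-size and transmission-interval tests described above.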
Purpose: In this thesis the viable system model (VSM) is used as a framework to develop a model for the management of a business alliance that contains the necessary and sufficient conditions for maintaining synergy of its constituent organisations and for adapting to a changing environment so that it can remain a long-term viable alliance. In addition, a model is developed that makes explicit the inherent link between the VSM and the core elements of knowledge management theory. Based on the alliance management model and the link established between the VSM and knowledge management, an application framework is developed to guide practitioners in defining necessary alliance management functions and relationships, the knowledge required by that management to fulfill those functions, and the processes that need to be in place to manage that knowledge. Design/strategy: The research has been divided into four phases: theoretical construction, refinement with practitioners, real-world application, and evaluation of test case and toolset. The researcher has worked closely with practitioners actively involved in the formation of a new international alliance to develop a VSM model and application framework for the alliance management. Formally, the research strategy has been defined as action research and the research philosophy as one of pragmatism. Findings/limitations: The developed application framework has been successfully used to identify absent and incomplete roles, actions, and interactions within the management of the specific alliance test case. This has helped to demonstrate how the application framework and VSM model can be used to diagnose and, most importantly, to articulate and visualise management deficiencies to facilitate clear and unambiguous discussions.
The timing of this cross-sectional research did not allow the application framework to be utilised from the outset of the alliance formation as an organisational planning tool, nor to its full extent to support the development of knowledge processes for the alliance management. However, the step-by-step approach used in developing the toolset and then explaining its application will allow the reader to judge its credibility and generalisability for other practical applications. Practical implications: The developed toolset consists of a VSM for an alliance management, job descriptions for that management (responsibilities, interfaces, and core competencies), a visual model illustrating the link between the VSM and knowledge management, and an application framework to guide the filling of the alliance management job descriptions in phases of recruitment, onboarding, and development (of interfaces and activities processes). Overall, one could say that the conditions prescribed by the VSM are rather obvious and yet, as seen in the specific alliance test case, many of these conditions have been completely overlooked by a management that was more than capable, willing, and empowered to enact those conditions. This gives a good indication that the toolset, which has been compiled in a visual and tabular systematic fashion, may well be useful to practitioners for the organisational planning of an alliance management. The visual representation of a management role in the VSM as a set of knowledge episodes put forward by this research is significant. It forces the express recognition that knowledge management is an integral part of every interaction that takes place and every action performed that, according to the VSM, are necessary and altogether are sufficient for viability.
It means that knowledge management cannot be considered as some abstract topic or unnecessary overhead or afterthought – it is entirely necessary, practical and forms a natural course of events during design of action/interaction processes. In other words, if an organisation is viable then, by definition, it does knowledge management whether or not it is formally recognised as such. The VSM, by defining necessary and sufficient actions and interactions for its roles, therefore provides a focus for relevant knowledge and serves as a tool for structured knowledge management. Originality/value: This research addresses a general academic call for hands-on insights of VSM applications by sharing real-world insights, artifacts and reflections generated by a practical and relevant organisational management application. It also addresses the potential, recognised by academics, for VSM as a framework for knowledge management by developing an intuitive model linking those theories and then using that model as part of a framework to guide its application. The introduction to aspects of knowledge management theory relevant to the model developed as well as the meticulousness and comprehensive explanation of the VSM provides a solid theoretical foundation for practitioners. The developed toolset is based on existing theories from multiple fields of research that have been logically linked and extended in an original and novel manner with a strong focus on practical application. This researcher’s hope is that this will stimulate interest for future research and practical application from academics and practitioners alike.
The production of liquid-gas mixtures with desired properties still places high demands on process technology and is usually realized in bubble columns. The physical calculation models used have individual dimensionless factors which, depending on the application, are only valid for small ranges of flow velocity, nozzle geometry, and test setup. An iterative but time-consuming design of such dispersion processes is used in industry for producing a liquid-gas mixture according to desired requirements. In the present investigation, we accelerate the necessary design loops by setting up a physical model, which consists of several subsystems that are enriched by dedicated experiments, to realize liquid-gas dispersions with low volume fraction and small air bubble diameters in oil. Our approach allows the extraction of individual dimensionless factors from maps of the introduced subsystems. These maps allow targeted corrective measures in a production process to maintain quality. The calculation-based approach avoids the need for iterative design loops. Overall, this approach supports the controlled generation of liquid-gas mixtures.
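The abstract does not name the dimensionless factors used, but bubble-column models typically involve groups such as the Reynolds and Weber numbers at the nozzle. The sketch below evaluates these two standard groups for assumed oil properties and nozzle conditions, purely as an illustration of what such a factor looks like.

```python
# Standard dimensionless groups for gas injection into a liquid.
# The formulas are textbook definitions; all property values are assumed
# and not taken from the paper.

def reynolds(rho, v, d, mu):
    """Re = rho*v*d/mu — ratio of inertial to viscous forces at the nozzle."""
    return rho * v * d / mu

def weber(rho, v, d, sigma):
    """We = rho*v^2*d/sigma — ratio of inertial to surface-tension forces."""
    return rho * v**2 * d / sigma

# Assumed oil properties and nozzle conditions (illustrative only):
rho, mu, sigma = 870.0, 0.03, 0.03   # density [kg/m^3], viscosity [Pa*s], surface tension [N/m]
v, d = 2.0, 1e-3                     # gas velocity [m/s], nozzle diameter [m]

Re = reynolds(rho, v, d, mu)   # = 58.0
We = weber(rho, v, d, sigma)   # = 116.0
```

Mapping such groups over the operating range of each subsystem, as the paper describes, is what lets a single correlation be reused instead of re-running the design loop for every new operating point.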
Supply shortages in products and resources from semiconductors to natural gas in recent years have had a massive impact on the global economy, but such challenges are not new for supply chain professionals. Many major events in the past have disrupted supply chains: the 9/11 attack in New York and the tsunami in Japan, to name a few, but COVID-19 has had the biggest and most widespread impact in modern times. Even though supply chain resilience is a term coined in the early 2000s, its usage and importance have increased since then. This research project was undertaken with the aim of assessing the current state of supply chain resilience literature and finding a resilience measurement method that fits all supply chains in the manufacturing industry of Vorarlberg. The research is carried out with mixed methods, using a systematic literature review followed by expert interviews. In conclusion, the author argues that there is a significant difference in the understanding of the term resilience within industry and that an established measure for resilience is lacking. The ways in which the structure of an organization impacts the level of resilience, as well as the foreseen benefits of digitalization and technologies for resilience, are also discussed. A comparative analysis of the SCR measurement methods found in the literature resulted in recommending the resilience index for on-time delivery proposed by Carvalho et al. for the mentioned industry.