Refine
Year of publication
Document Type
- Conference Proceeding (307)
- Article (279)
- Part of a Book (53)
- Book (19)
- Doctoral Thesis (9)
- Report (6)
- Working Paper (4)
- Other (3)
- Periodical (3)
- Part of Periodical (3)
Institute
- Forschungszentrum Mikrotechnik (235)
- Forschungszentrum Business Informatics (149)
- Technik | Engineering & Technology (127)
- Department of Computer Science (dissolved at the end of 2021; integrated into the superordinate organizational unit Technik) (112)
- Wirtschaft (105)
- Forschungszentrum Energie (77)
- Didaktik (dissolved as of 31 March 2021; integrated into the TELL Center) (37)
- Forschungszentrum Human Centred Technologies (35)
- Soziales & Gesundheit (33)
- Josef Ressel Zentrum für Materialbearbeitung (27)
- Forschungszentrum Digital Factory Vorarlberg (14)
- Department of Engineering (dissolved at the end of 2021; integrated into the superordinate organizational unit Technik) (12)
- Forschungsgruppe Empirische Sozialwissenschaften (12)
- Forschung (10)
- Josef Ressel Zentrum für Robuste Entscheidungen (8)
- Gestaltung (6)
- Josef Ressel Zentrum für Intelligente Thermische Energiesysteme (5)
Language
- English (687)
Is part of the Bibliography
- yes (687)
Keywords
- Laser ablation (11)
- Y-branch splitter (11)
- arrayed waveguide gratings (11)
- photonics (8)
- Evolution strategy (7)
- Demand side management (6)
- Optimization (6)
- integrated optics (6)
- Arrayed waveguide gratings (5)
- Evolution Strategies (5)
The utilization of lasers in dentistry has expanded greatly in recent years. For instance, fs-lasers are effective for both drilling and caries prevention, while cw-lasers are useful for adhesive hardening. A cutting-edge application of lasers in dentistry is the debonding of veneers. While there are pre-existing tools for this purpose, there is still potential for improvement. We present initial efforts to investigate laser-assisted debonding mechanisms through measurements of the optical and mechanical properties of teeth and prosthetic ceramics. Preliminary tests conducted with a commercially available laser system used for debonding revealed differences between the output power set at the system's console and that measured at specified distances from the handpiece. Furthermore, the optical properties of the samples (human teeth and ceramics) were characterised. The optical properties of the ceramics should closely resemble those of teeth in terms of look and feel, but they also influence the laser-assisted debonding technique and must therefore be taken into account. In addition, first attempts were made to investigate the mechanical properties of the samples by means of pump-probe elastography under a microscope. By analyzing the sample surface up to 20 ns after the impact of a fs-laser pulse, pressure and shock waves could be detected, which can be utilized to determine the elastic constants of specific materials. Together, such investigations form the basis for a purely optical approach to the debonding of veneers utilizing acoustic waves.
In this paper, we consider the question of data aggregation using the practical example of emissions data for economic activities in the sustainability assessment of regional bank clients. Given the current scarcity of company-specific emission data, an approximation relies on available public data. These data are reported in different standards across different sources. To determine a mapping between the different standards, an adaptation of the Covariance Matrix Self-Adaptation Evolution Strategy is proposed. The obtained results show that high-quality mappings are found. Moreover, our approach is transferable to other data compatibility problems, such as merging emissions data for other countries or bridging the gap between completely different data sets.
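The search described above can be illustrated with a toy evolution strategy. The sketch below is a minimal (mu/mu, lambda) scheme with self-adaptive step size, not the paper's actual CMSA-ES (it omits covariance matrix adaptation), and the objective function is a hypothetical stand-in for the mapping-quality measure:

```python
import math
import random

def es_minimize(f, x0, sigma0=1.0, lam=20, iters=200, seed=42):
    """Minimal (mu/mu, lambda) evolution strategy with self-adaptive step
    size -- a toy stand-in for the CMSA-ES used in the paper (no
    covariance matrix adaptation)."""
    rng = random.Random(seed)
    n, mu = len(x0), lam // 4
    mean, sigma = list(x0), sigma0
    tau = 1.0 / math.sqrt(2.0 * n)  # standard learning rate for the step size
    for _ in range(iters):
        offspring = []
        for _ in range(lam):
            s = sigma * math.exp(tau * rng.gauss(0.0, 1.0))  # mutate step size
            x = [m + s * rng.gauss(0.0, 1.0) for m in mean]  # mutate solution
            offspring.append((f(x), x, s))
        offspring.sort(key=lambda t: t[0])                   # comma selection
        best = offspring[:mu]
        mean = [sum(x[i] for _, x, _ in best) / mu for i in range(n)]
        sigma = math.exp(sum(math.log(s) for _, _, s in best) / mu)
    return mean, f(mean)

# hypothetical stand-in objective: squared deviation from a known mapping vector
target = [0.3, -1.2, 0.8]
err = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
solution, residual = es_minimize(err, [0.0, 0.0, 0.0])
```

The real algorithm additionally adapts a full covariance matrix, which matters when the mapping parameters are strongly correlated.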
This study presents different approaches to increase the sensing area of NiO-based semiconducting metal oxide gas sensors. Micro- and nanopatterned laser-induced periodic surface structures (LIPSS) are generated on silicon and Si/SiO2 substrates. The surface morphologies of the fabricated samples are examined by FE-SEM. For evaluating the hydrogen and acetone sensitivity of a NiO-based test sensor, we select the silicon samples with an intermediate Si3N4 layer due to its superior isolation quality over the thermal oxide.
Objectives: The MetabQoL 1.0 is the first disease-specific health-related quality of life (HrQoL) questionnaire for patients with intoxication-type inherited metabolic disorders. Our aim was to assess the validity and reliability of the MetabQoL 1.0, and to investigate neuropsychiatric burden in our patient population. Methods: Data from 29 patients followed at a single center, aged between 8 and 18 years with the diagnosis of methylmalonic acidemia (MMA), propionic acidemia (PA) or isovaleric acidemia (IVA), and their parents were included. The Pediatric Quality of Life Inventory (PedsQL) was used to evaluate the validity and reliability of the MetabQoL 1.0.
Results: The MetabQoL 1.0 was shown to be valid and reliable (Cronbach's alpha: 0.64–0.9). Fourteen of the 22 patients (63.6%) formally evaluated had neurological findings. Of note, 17 of 20 patients (85%) had a psychiatric disorder when formally evaluated by a child and adolescent psychiatrist. The median mental scores of the MetabQoL 1.0 proxy report were significantly higher than those of the self-report (p = 0.023). Patients with neonatal-onset disease had higher MetabQoL 1.0 proxy physical (p = 0.008), mental (p = 0.042), and total scores (p = 0.022), as well as higher self-report social (p = 0.007) and total scores (p = 0.043), than those with later-onset disease.
Conclusions: This study continues to prove that the MetabQoL 1.0 is an effective tool to measure what matters in intoxication-type inherited metabolic disorders. Our results highlight the importance of clinical assessment complemented by patient reported outcomes which further expands the evaluation toolbox of inherited metabolic diseases.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid-fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R^2 = 0.95 and a mean absolute error of MAE = 1.2 t/h are reached, at an average steam production of 35.1 t/h. On average, training on a dynamic dataset lowers the MAE by 32% compared to training on a static dataset. A feature set based on expert knowledge lowers the MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When a dynamic dataset is used, the performance parameters of thermal power plants are predicted with high accuracy, allowing short-term problems to be detected.
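The two reported error measures are straightforward to reproduce. A minimal sketch with illustrative steam-production values (not the plant data from the study):

```python
def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# illustrative steam-production values in t/h (hypothetical, not the plant data)
y_true = [34.0, 36.0, 35.0, 38.0]
y_pred = [33.5, 36.5, 34.0, 37.0]
```

Note that R^2 compares the model against the trivial mean predictor, so a constant forecast of the average load already scores 0.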
Highly-sensitive single-step sensing of levodopa by swellable microneedle-mounted nanogap sensors
(2023)
Microneedle (MN) sensing of biomarkers in interstitial fluid (ISF) can overcome the challenges of self-diagnosis of diseases by a patient, such as blood sampling, handling, and measurement analysis. However, MN sensing technologies still suffer from poor measurement accuracy due to the small amount of target molecules present in ISF, and require multiple steps of ISF extraction, ISF isolation from the MN, and measurement with additional equipment. Here, we present a swellable MN-mounted nanogap sensor that can be inserted into the skin tissue, absorb ISF rapidly, and measure biomarkers in situ by amplifying the measurement signals through redox cycling in nanogap electrodes. We demonstrate that the MN-nanogap sensor measures levodopa (LDA), a medication for Parkinson's disease, down to 100 nM in aqueous solution, and 1 μM in both a skin-mimicking gelatin phantom and porcine skin.
Organic acidurias (OAs), urea-cycle disorders (UCDs), and maple syrup urine disease (MSUD) belong to the category of intoxication-type inborn errors of metabolism (IT-IEM). Liver transplantation (LTx) is increasingly utilized in IT-IEM. However, its impact has mainly been assessed with clinical outcome measures and rarely with health-related quality of life (HRQoL). The aim of the study was to investigate the impact of LTx on HRQoL in IT-IEM. This single-center prospective study involved 32 patients (15 OA, 11 UCD, 6 MSUD; median age at LTx 3.0 years, range 0.8–26.0). HRQoL was assessed pre/post transplantation by the PedsQL General Module 4.0 and by MetabQoL 1.0, a tool specifically designed for IT-IEM. PedsQL highlighted significant post-LTx improvements in total and physical functioning in both patients' and parents' scores. According to age at transplantation (≤3 vs. >3 years), younger patients showed higher post-LTx scores on Physical (p = 0.03), Social (p < 0.001), and Total (p = 0.007) functioning. MetabQoL confirmed significant post-LTx changes in Total and Physical functioning in both patients' and parents' scores (p ≤ 0.009). Unlike PedsQL, MetabQoL Mental (patients p = 0.013, parents p = 0.03) and Social scores (patients p = 0.02, parents p = 0.012) were significantly higher post-LTx. Significant improvements (p = 0.001–0.04) were also detected in both self- and proxy-reports for almost all MetabQoL subscales. This study shows the importance of assessing the impact of transplantation on HRQoL, a meaningful outcome reflecting patients' wellbeing. LTx is associated with significant improvements of HRQoL in both self- and parent-reports. The comparison between PedsQL-GM and MetabQoL highlighted that MetabQoL demonstrated higher sensitivity in the assessment of disease-specific domains than the generic PedsQL tool.
Long-term outcome of infantile onset Pompe disease patients treated with enzyme replacement therapy
(2024)
Background: Enzyme replacement therapy (ERT) with recombinant human alglucosidase alfa (rhGAA) was approved in Europe in 2006. Nevertheless, data on the long-term outcome of infantile onset Pompe disease (IOPD) patients at school age is still limited.
Objective: We analyzed in detail cardiac, respiratory, motor, and cognitive function of 15 German-speaking patients aged 7 and older who started ERT at a median age of 5 months.
Results: Starting dose was 20 mg/kg biweekly in 12 patients, 20 mg/kg weekly in 2, and 40 mg/kg weekly in one patient. CRIM-status was positive in 13 patients (86.7%) and negative or unknown in one patient each (6.7%). Three patients (20%) received immunomodulation. Median age at last assessment was 9.1 (7.0–19.5) years. At last follow-up 1 patient (6.7%) had mild cardiac hypertrophy, 6 (42.9%) had cardiac arrhythmias, and 7 (46.7%) required assisted ventilation. Seven patients (46.7%) achieved the ability to walk independently and 5 (33.3%) were still ambulatory at last follow-up. Six patients (40%) were able to sit without support, while the remaining 4 (26.7%) were tetraplegic. Eleven patients underwent cognitive testing (Culture Fair Intelligence Test), while 4 were unable to meet the requirements for cognitive testing. Intelligence quotients (IQs) ranged from normal (IQ 117, 102, 96, 94) in 4 patients (36.4%) to mild developmental delay (IQ 81) in one patient (9.1%) to intellectual disability (IQ 69, 63, 61, 3x < 55) in 6 patients (54.5%). White matter abnormalities were present in 10 out of 12 cerebral MRIs from 7 patients.
Measuring what matters
(2023)
Patient-reported outcomes (PROs) are generally defined as ‘any report of the status of a patient's health condition that comes directly from the patient, without interpretation of the patient's response by a clinician or anyone else’. A broader definition of PRO also includes ‘any information on the outcomes of health care obtained directly from patients without modification by clinicians or other health care professionals’. Following this approach, PROs encompass patients' subjective perceptions of how they function or feel, not only in relation to a health condition but also to its treatment, as well as concepts such as health-related quality of life (HrQoL), information on the functional status of a patient, signs and symptoms, and symptom burden. PRO measurement instruments (PROMs) are mostly questionnaires and inform about what patients can do and how they feel. PROs and PROMs have not yet found unconditional acceptance and wide use in the field of inborn errors of metabolism (IEM). This review summarises the importance and usefulness of PROs in research, drug legislation and clinical care, and informs about quality standards, development, and potential methodological shortfalls of PROMs. Inclusion of PROs, measured with high-quality, well-selected PROMs, in clinical care, drug legislation, and research helps to identify unmet needs, improve quality of care, and define outcomes that are meaningful to patients. The field of IEM should open up to new methodological approaches, such as the definition of core sets of variables, including PROs, to be systematically assessed in specific metabolic conditions, and to new collaborations with PRO experts such as psychologists, to facilitate the systematic collection of meaningful data.
Why do some countries assign a major role to wind energy in decarbonizing their electricity systems, while others are much less committed to this technology? We argue that processes of (de-)legitimation, driven by discourse coalitions who strategically employ certain storylines in public debates, provide part of the answer. To illustrate our approach, we comparatively investigate public discourses surrounding wind energy in Austria and Switzerland, two countries that differ strongly in wind energy deployment. By combining a qualitative content analysis and a discourse network analysis of 808 newspaper articles published 2010–2020, we identify four distinct sets of storylines used to either delegitimize or legitimize the technology. Our study indicates that low deployment rates in Switzerland can be related to the prominence of delegitimizing storylines in the public discourse, which result in a rather low socio-political acceptance of wind energy. In Austria, by contrast, there is more consistent support for wind energy by discourse coalitions using a broad set of legitimizing storylines. By bridging the related but separate literatures of technology legitimacy and social acceptance, our study contributes to a better understanding of socio-political conflict and divergence in low-carbon technological pathways.
A step change is needed in the deployment of renewable energy if the triple challenge of ensuring climate change mitigation, energy security, and energy affordability is to be met. Yet, social acceptance of infrastructure projects and policies remains a key concern. While there has been decades of fruitful research on the social acceptance of wind energy and other renewables, much of the extant research is cross-sectional in nature, failing to capture the important dynamic processes that can make or break renewable energy projects. This paper introduces a Special Issue of Energy Policy which focuses on the neglected topic of the dynamics of social acceptance of renewable energy, drawing on contributions made at an international research conference held in St. Gallen (Switzerland) in June 2022. In addition to introducing these papers and drawing out common themes, we also seek to offer some conceptual clarity on the issue of dynamics in social acceptance, taking into account the influence of time, power, and scale in shaping decision-making processes. We conclude by highlighting a number of avenues of potential future research.
X-ray microtomography is a nondestructive, three-dimensional inspection technique applied across a vast range of fields and disciplines, from research to industry, encompassing engineering, biology, and medical research. Phase-contrast imaging extends the domain of application of x-ray microtomography to classes of samples that exhibit weak attenuation and thus appear with poor contrast in standard x-ray imaging. Notable examples are low-atomic-number materials, like carbon-fiber composites, soft matter, and biological soft tissues. We report on a compact and cost-effective system for x-ray phase-contrast microtomography. The system features high sensitivity to phase gradients and high resolution, requires a low-power sealed x-ray tube and a single optical element, and fits in a small footprint. It is compatible with standard x-ray detector technologies: in our experiments, we observed that single-photon counting offered higher angular sensitivity, whereas flat panels provided a larger field of view. The system is benchmarked against known-material phantoms, and its potential for soft-tissue three-dimensional imaging is demonstrated on small-animal organs: a piglet esophagus and a rat heart. We believe that the simplicity of the setup we are proposing, combined with its robustness and sensitivity, will facilitate access to quantitative x-ray phase-contrast microtomography as a research tool across disciplines, including tissue engineering, materials science, and nondestructive testing in general.
Parametric anti-resonance is a phenomenon that occurs in systems with at least two degrees of freedom; it can be achieved by periodically exciting some parameters of the system. The effect of this properly tuned periodicity is to increase the dissipation in the system, which leads to an increase in the effective damping of vibrations. This contribution presents the design of an open-loop control to reduce the settling time using the anti-resonance concept. The control signal consists of a quasi-periodic signal capable of transferring the system's oscillations from one mode to another mode of the system. The general averaging technique is used to characterize the dynamics, particularly the so-called slow dynamics of motion. With this analysis, the control signal is designed for the potential application of a microelectromechanical sensor arrangement; for this specific example, a reduction of the settling time of up to 96.8% is achieved.
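As a sketch of the underlying mechanism (a generic two-degree-of-freedom model, not the exact equations of the contribution), harmonic excitation of a stiffness parameter yields equations of the Mathieu type, and the anti-resonance typically appears near the difference combination frequency of the two modes:

```latex
\ddot{\mathbf{x}} + \mathbf{C}\,\dot{\mathbf{x}}
  + \left[\mathbf{K}_0 + \varepsilon\,\mathbf{K}_1 \cos(\Omega t)\right]\mathbf{x} = \mathbf{0},
\qquad
\Omega \approx \left|\omega_2 - \omega_1\right|
```

Here ω1 and ω2 are the natural frequencies of the unexcited system; tuning the excitation frequency Ω to this combination frequency transfers vibration energy between the two modes, so that the damping of both acts on the decaying oscillation.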
In this work, parametric excitation is introduced in a fully balanced flexible rotor mounted on two identical active gas foil bearings. The active gas foil bearings change the top foil shape harmonically with a specific amplitude and frequency. The deformable foil shape is approximated by an analytical function, while the gas pressure distribution is evaluated by the numerical solution of the Reynolds equation for compressible flow. The harmonic variation of the foil shape generates a respective variation in the bearings' stiffness and damping properties, and the system experiences parametric resonances and anti-resonances at specific excitation frequencies. The nonlinear gas bearing forces generate bifurcations in the solutions of the system at certain rotating speeds and excitation frequencies; period-doubling and Neimark-Sacker bifurcations are noticed in the examined system, and their progress is evaluated as the two bifurcation parameters (rotating speed and parametric excitation frequency) are changed, through a codimension-2 numerical continuation of limit cycles. It is found that in a specific range of excitation frequencies there are parametric anti-resonances and the bifurcations collide and vanish. Therefore, a bifurcation-free operating range is established and the system can operate stably over a wide speed range.
Digitalization is changing business models and operational processes. At the same time, improved data availability and powerful analytical methods are influencing controlling and increasingly require the use of statistical and information technology skills and knowledge. Using a case study from marketing controlling, the article shows the use of business analytics methods and addresses the tasks of controlling in the digital age.
By a simple femtosecond laser process, we fabricated metal-oxide/gold composite films for electrical and optical gas sensors. We designed a dripple wavelength AWG spectrometer, matched to the plasma absorption wavelength region of the composite films. H2/CO absorptions fit well with the AWG design for multi-gas detection sensor arrays.
Pooled data from published reports on infants with clinically diagnosed vitamin B12 (B12) deficiency were analyzed with the purpose of describing the presentation, diagnostic approaches, and risk factors for the condition to inform prevention strategies. An electronic (PubMed database) and manual literature search following the PRISMA approach was conducted (preregistration with the Open Science Framework, accessed on 15 February 2023). Data were described and analyzed using correlation analyses, Chi-square tests, ANOVAs, and regression analyses, and 102 publications (292 cases) were analyzed. The mean age at first symptoms (anemia, various neurological symptoms) was four months; the mean time to diagnosis was 2.6 months. Maternal B12 at diagnosis, exclusive breastfeeding, and a maternal diet low in B12 predicted infant B12, methylmalonic acid, and total homocysteine. Infant B12 deficiency is still not easily diagnosed. Methylmalonic acid and total homocysteine are useful diagnostic parameters in addition to B12 levels. Since maternal B12 status predicts infant B12 status, it would probably be advantageous to target women in early pregnancy or even preconceptionally to prevent infant B12 deficiency, rather than to rely on newborn screening that often does not reliably identify high-risk children.
Grey-box models provide an important approach for control analysis in the heating, ventilation and air conditioning (HVAC) sector. Grey-box models consist of physical models whose parameters are estimated from data. Due to the vast number of component models found in the literature, the question arises as to which component models perform best on a given system or dataset. This question is investigated systematically using a test case system with real operational data. The test case system consists of an HVAC system containing an energy recovery unit (ER), a heating coil (HC) and a cooling coil (CC). For each component, several suitable model variants from the literature are adapted appropriately and implemented: four model variants for the ER and five model variants each for the HC and CC. Further, three global and four local optimization algorithms are implemented to solve the nonlinear least-squares system identification, leading to a total of 700 combinations. The comparison of all variants shows that the global optimization algorithms do not provide significantly better solutions, while their runtimes are significantly higher. Analysis of the models shows that model accuracy depends on the total number of parameters.
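To illustrate the identification step, consider a hypothetical one-parameter heating-coil model T_out = T_in + eta * u * dT_max (u = valve opening); this is not one of the paper's model variants, but because it is linear in eta, the least-squares estimate has a closed form:

```python
def fit_eta(records, dT_max=40.0):
    """Least-squares estimate of the efficiency parameter eta in the
    hypothetical grey-box heating-coil model
        T_out = T_in + eta * u * dT_max,
    with u the valve opening in [0, 1].  The model is linear in eta,
    so the normal equation gives eta directly."""
    num = sum((t_out - t_in) * u * dT_max for t_in, u, t_out in records)
    den = sum((u * dT_max) ** 2 for _, u, _ in records)
    return num / den

# synthetic, noise-free operating points (T_in, u, T_out), generated with eta = 0.8
data = [(20.0, 0.50, 36.0), (18.0, 0.25, 26.0), (22.0, 1.00, 54.0)]
```

The nonlinear component models in the paper lack such a closed form, which is exactly why iterative local or global optimizers are needed there.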
Purpose – The purpose of this study is to explore the exogenous and endogenous drivers of the high-growth of Unicorn start-ups along their life cycle, with a particular focus on Unicorns in the fintech industry.
Design/methodology/approach – The study employs an explorative longitudinal analysis with a matched pair of two Unicorn start-up cases with similar antecedent features to understand drivers holistically over the longer term.
Findings – High-growth patterns over the longer term are the result of a combined industry- and company-life-cycle perspective. Drivers and growth patterns vary significantly according to the time of entry into the industry and its development status. The findings are systematised within a set of propositions to be tested in future research.
Research limitations/implications – The limitations lie in the empirical evidence, as the analysis is limited to one matched pair. The revealed drivers of Unicorns' long-term growth might encourage future research to investigate them on a larger scale.
Practical implications – The study offers practical recommendations for start-ups with high-growth ambitions and advice to policy makers regarding the development of tailor-made support programs.
Originality/value – The study significantly extends extant work on growth and high-growth by examining endogenous and exogenous triggers over time and by linking the Unicorn-life cycle to the industry life cycle, an approach which has, to the best of the authors’ knowledge, not yet been applied.
International Entrepreneurship explains the opportunities and challenges facing internationalizing entrepreneurial ventures. The book includes a thorough discussion of fundamentals as well as contemporary research findings. Numerous cases, featuring diverse contexts, illustrate the theory and support classroom use.
The main aims of this work are the validation of the developed process for gluing a single-mode optical fiber array to a photonic chip and the selection of the more suitable adhesive from the two adhesives being compared. An active alignment system was used to align the two optical fiber arrays to a photonic chip. The gluing was done with two compared UV-curable adhesives applied in the optical path. The insertion losses of the glued coupling were measured and investigated at two discrete wavelengths, 1310 nm and 1550 nm, during temperature testing in a climatic chamber according to Telcordia GR-1209-CORE, Issue 4 [3]. The measurement, investigation, and comparison of the insertion losses of the glued coupling over the spectral band from 1530 nm to 1570 nm were done immediately after the gluing process and after three temperature cycles in the climatic chamber with a one-month delay.
In 2021, a prominent Austrian dairy producer suffered an IT attack and was completely paralysed. Without clearly defined mitigation measures in place, major disruptions were caused along the whole supply chain, including logistics service providers, governmental food safety bodies, and retailers (i.e., supermarkets and convenience stores). In this paper, we ask how digitisation and digital transformation impact IT security, especially when considering the complex company ecosystems of food production and food supply chains in Austria. The problem statement stems from a gap in knowledge about key differences in approaches towards IT security, resilience, risk management and, especially, business interfaces between food suppliers, supermarkets, distributors, logistics and other service providers. To answer the related research questions, the authors first conduct literature research, highlight common guidelines and standardisation, and look at state-based recommendations for critical infrastructure. In a second step, the paper describes in detail a quantitative and qualitative survey of Austrian food companies (producers and retailers). A description of recommended measures for the industry, further steps, and an outlook conclude the paper.
Background: Cardiovascular disease is the major cause of death worldwide. Although knowledge regarding diagnosing and treating cardiovascular disease has increased dramatically, secondary prevention remains insufficiently implemented because affected individuals fail to adhere to guideline recommendations. This has continued to lead to high morbidity and mortality rates. Involving patients in their healthcare and facilitating their active roles in their chronic disease management is an opportunity to meet the needs of the increasing number of cardiovascular patients. However, simple recall of advice regarding a more preventive lifestyle does not bring about sustainable behavioral lifestyle changes. We investigate the effect of plaque visualization combined with low-threshold daily lifestyle tasks using the smartphone app PreventiPlaque to evaluate changes in cardiovascular risk profile. Methods and study design: This randomized, controlled clinical trial includes 240 participants with ultrasound evidence of atherosclerotic plaque in one or both carotid arteries, defined as focal thickening of the vessel wall measuring 50% more than the regular vessel wall. A criterion for participation is access to a smartphone suitable for app usage. The participants are randomly assigned to an intervention or a control group. While both groups receive the standard of care, the intervention group has additional access to the PreventiPlaque app during the 12-month follow-up. The app includes daily tasks that promote a healthier lifestyle in the areas of smoking cessation, medication adherence, physical activity, and diet. The impact of plaque visualization and app use on the change in cardiovascular risk profile is assessed by SCORE2. Feasibility and effectiveness of the PreventiPlaque app are evaluated using standardized and validated measures for patient feedback.
The production of liquid-gas mixtures with desired properties still places high demands on process technology and is usually realized in bubble columns. The physical calculation models used contain individual dimensionless factors which, depending on the application, are only valid for small ranges of flow velocity, nozzle geometry and test setup. In industry, an iterative but time-consuming design of such dispersion processes is used to produce a liquid-gas mixture according to the desired requirements. In the present investigation, we accelerate the necessary design loops by setting up a physical model consisting of several subsystems that are enriched by dedicated experiments to realize liquid-gas dispersions with low volume fraction and small air-bubble diameters in oil. Our approach allows the extraction of individual dimensionless factors from maps of the introduced subsystems. These maps allow targeted corrective measures in a production process to maintain quality. The calculation-based approach avoids iterative design loops. Overall, this approach supports the controlled generation of liquid-gas mixtures.
Creating a schedule to perform certain actions in a real-world environment typically involves multiple types of uncertainty. To create a plan which is robust towards uncertainties, it must stay flexible while attempting to be reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive. The planning algorithms must be able to re-schedule planned actions and adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes. This is often the case because they aim to start with complicated and time-intensive Machine Learning scenarios. This work evaluates the Process Capability Analysis (PCA) as a pragmatic, easy and quick way of leveraging the gathered machine data from the production process. The area of application considered is injection molding. After describing all the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA results in multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing whole process data instead of considering only selected production samples. The use of machine data enables additional insights to be drawn about process stability and the associated product quality.
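The core of such an analysis are the capability indices Cp and Cpk. A minimal sketch with a hypothetical part dimension and specification limits (the paper's actual KPIs and data are not reproduced here):

```python
import math

def capability(samples, lsl, usl):
    """Process capability indices from a sample of measurements.

    Cp  = (USL - LSL) / (6 * sigma)                  -- potential capability
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)  -- accounts for centring
    Uses the sample standard deviation (n - 1 denominator)."""
    n = len(samples)
    mean = sum(samples) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3.0 * sigma)
    return cp, cpk

# hypothetical part dimension in mm with specification 10.0 +/- 0.3
measurements = [9.9, 10.0, 10.1, 10.0, 10.0]
cp, cpk = capability(measurements, lsl=9.7, usl=10.3)
```

A process centred in the tolerance band has Cpk = Cp; a drifting process shows Cpk falling below Cp, which is what makes the indices useful for continuous monitoring.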
Tap or swipe
(2023)
Demand-side management approaches that exploit the temporal flexibility of electric vehicles have attracted much attention in recent years due to their increasing market penetration. Such demand-side management measures help alleviate the burden on the power system, especially in distribution grids where bottlenecks are more prevalent. Electric vehicles are an attractive asset for distribution system operators, with the potential to provide grid services if properly managed. In this thesis, a systematic investigation is first conducted for two demand-side management methods typically reported in the literature: a voltage droop control-based approach and a market-driven approach. Then, a decentralized autonomous demand-side management scheme for electric vehicle charging scheduling is proposed, which relies on a unidirectionally communicated grid-induced signal. In all the topics considered, the implications for distribution grid operation are evaluated using time-series load flow simulations performed for representative Austrian distribution grids. Droop control mechanisms, which require no communication, are discussed for electric vehicle charging control. The method provides an economically viable solution at all penetration levels if electric vehicles charge at low nominal power rates. However, given current market trends in residential charging equipment, especially in the European context where most charging equipment is designed for 11 kW, the long-run technical feasibility of the method is debatable. As electricity demand strongly correlates with energy prices, a linear optimization algorithm is proposed to minimize charging costs, using next-day market prices as the grid-induced incentive function under the assumption of perfect user predictions. Constraints on the state of charge guarantee that the energy required for driving is delivered without failure.
An average energy cost saving of 30% is realized at all penetration levels. Nevertheless, the avalanche effect of simultaneous charging during low-price periods introduces new power peaks exceeding those of uncontrolled charging, obstructing the grid-friendly integration of electric vehicles.
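The cost-minimising charging problem described above has a single energy-balance constraint and per-hour power bounds, so its optimum can be illustrated with a greedy fill of the cheapest hours. Prices, the energy demand and the 11 kW limit below are illustrative, not data from the thesis:

```python
def cheapest_schedule(prices, e_required, p_max):
    """Charge in the cheapest hours first, up to the charger limit,
    until the required energy is met. With one energy-balance equality
    and simple power bounds, this greedy rule is exactly the LP optimum."""
    schedule = [0.0] * len(prices)
    remaining = e_required
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        schedule[hour] = min(p_max, remaining)
        remaining -= schedule[hour]
        if remaining <= 0:
            break
    return schedule

# Six hourly day-ahead prices (EUR/kWh), 20 kWh needed, 11 kW charger.
prices = [0.30, 0.25, 0.10, 0.08, 0.12, 0.28]
plan = cheapest_schedule(prices, e_required=20.0, p_max=11.0)
cost = sum(p * e for p, e in zip(prices, plan))
print(plan)            # [0.0, 0.0, 9.0, 11.0, 0.0, 0.0]
print(round(cost, 2))  # 1.78
```

The avalanche effect mentioned above is visible here in miniature: every vehicle running this schedule piles its full 11 kW into the same cheapest hour.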
In the era of digital transformation, an evolution is taking place that calls for new perspectives on leadership, especially in virtual teams. Shared leadership is a promising leadership form to meet the challenges of a virtual team setting. In particular, studies show that shared leadership increases performance, team creativity and innovative behavior. Moreover, responsibility is distributed among several individuals rather than resting with one. Nevertheless, it is unclear which skills are needed in shared leadership teams and how they could be trained. We therefore develop a conceptual framework to pave the way for an empirical inquiry into the skills for, and the role of, shared leadership. Moreover, we encourage the discussion of whether current leadership development is still viable and offer practical implications for developing shared leadership.
A model is presented that allows for the calculation of the probability with which a vanilla Evolution Strategy converges to the global optimizer of the Rastrigin test function. From this model, a population size scaling formula is derived that estimates the population size needed to ensure high convergence security depending on the search space dimensionality.
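To make the notion of success probability concrete, here is a toy Monte Carlo estimate for a deliberately stripped-down (1,λ)-ES (fixed decaying mutation strength, no recombination or self-adaptation, so not the strategy analysed in the paper) on the Rastrigin function:

```python
import math, random

def rastrigin(x):
    """Rastrigin test function; the global optimum is f(0,...,0) = 0."""
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def success_rate(dim=2, lam=40, trials=30, gens=200, seed=1):
    """Fraction of independent (1,lambda)-ES runs ending in the global
    attractor basin - a Monte Carlo stand-in for the success probability
    that the model predicts analytically."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        parent = [rng.uniform(-5.12, 5.12) for _ in range(dim)]
        sigma = 0.5
        for _ in range(gens):
            offspring = [[p + rng.gauss(0, sigma) for p in parent]
                         for _ in range(lam)]
            parent = min(offspring, key=rastrigin)  # comma selection
            sigma *= 0.98                           # simple deterministic decay
        hits += rastrigin(parent) < 1.0             # near the global optimum
    return hits / trials

rate = success_rate()
print(rate)  # raising lambda raises this rate, mirroring the scaling formula
```

The population size scaling formula plays the role of choosing λ (and the population sizes of the full strategy) so that this empirical rate is driven close to one.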
The thorny issue of time
(2023)
Digital twin as enabler of business model innovation for infrastructure construction projects
(2023)
Emerging technologies and methods are becoming an important element of the construction industry. Digital twins are used as a base to store data in BIM models, to exploit that data and to make it visible. Transparency in all phases of the lifecycle of building and infrastructure assets is crucial for a more efficient lifecycle of planning, construction and maintenance. Whereas other industries have increased performance in these phases by making use of their data, the construction industry is stuck in traditional methods and business models. In this paper we propose a concept that focuses on the digital production twin. Comparing planning data with as-is production data can empower a data-driven continuous improvement process and support the decision-making process for future innovations and suitable business models. This paper outlines the possibility of using the data stored in a digital twin to evaluate possible business models.
Through mandatory ESG (environmental, social, governance) reporting, large companies must disclose their ESG activities, showing how sustainability risks are incorporated in their decision-making and production processes. This disclosure obligation, however, does not apply to small and medium-sized enterprises (SME), creating a gap in the ESG dataset. Banks are therefore required to collect sustainability data of their SME customers independently to ensure complete ESG integration in the risk analysis process for loans. In this paper, we examine ESG risk analysis through a smart science approach, focusing on possible value outcomes of sustainable smart services for banks as well as for their (SME) customers. The paper describes ESG factors, how services can be derived from them, targeted ESG metrics, and an ESG Service Creation Framework (business ecosystem building, process model, and value creation). The paper concludes with the description of an exemplary use case highlighting the ecosystem necessary for service creation as well as the value created.
The role of entrepreneurs and intrapreneurs in the current zeitgeist is to drive innovation and re-shape rigid, established processes in business as well as for consumers. They use new viewpoints to pioneer new (business) models which focus on ‘smartness’ rather than the purely monetary and short-sighted models of yesteryear. Fostering and supporting the culture of this current zeitgeist is a major challenge for entre- and intrapreneurial support infrastructures, namely startup centres and innovation hubs of universities and other public institutions as well as innovation centres of private companies. Support may range from access to funding, over the provision of resources such as offices or computing hardware, to coaching in the development of business ideas and strategic roadmaps for product and service deployment. In this paper, we first describe the status quo of the aforementioned support infrastructures in Vorarlberg and the Lake Constance region, then extend the scope to existing (international) approaches for aiding founders and innovators in the development of smart services. An analysis of success stories of the Vorarlberg startup centre ‘startupstube’ and other initiatives, including their comparison to international counterparts, builds the basis for a methodological framework for (service science) coaching in entre- and intrapreneurial support infrastructures. The paper concludes with the description of a framework for choosing the right methods and tools to create service value in entre-/intrapreneurship based upon tested, proven know-how, and for defining support infrastructure needs based upon pre-defined stakeholder and target groups as well as the (industry) sectors of the innovators.
In this paper, a 256-channel, 10-GHz arrayed waveguide grating (AWG) demultiplexer for ultra-dense wavelength division multiplexing was designed using an in-house developed tool called AWG-Parameters. The AWG demultiplexer was designed for a central wavelength of 1550 nm and the structure was simulated in the PHASAR tool from Optiwave. Two different AWG designs were developed and the influence of the design parameters on the AWG performance was studied.
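As a rough sanity check on such a grid (not a calculation from the paper), the 10 GHz frequency spacing can be converted to a wavelength spacing around the 1550 nm centre via Δλ = λ²·Δf/c:

```python
# Converting the 10 GHz channel spacing of the AWG grid into
# wavelength terms around the 1550 nm central wavelength.
C = 299_792_458.0   # speed of light in vacuum, m/s

lambda_c = 1550e-9  # central wavelength, m
delta_f = 10e9      # channel spacing, Hz

delta_lambda = lambda_c ** 2 * delta_f / C  # spacing in wavelength terms
band = 256 * delta_lambda                   # span of the 256-channel grid

print(f"channel spacing = {delta_lambda * 1e9:.3f} nm")  # ~0.080 nm
print(f"256-channel span = {band * 1e9:.1f} nm")         # ~20.5 nm
```

The sub-0.1 nm channel spacing is what makes this grid "ultra-dense" compared with conventional 50 or 100 GHz DWDM grids.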
The design, simulation, and optimization of a 1×4 optical three-dimensional multimode interference splitter using IP-Dip polymer as a core and polydimethylsiloxane (PDMS) Sylgard 184 as a cladding is demonstrated. The splitter was simulated using the beam propagation method in the BeamPROP simulation module of the RSoft photonics tool and optimized for an operating wavelength of 1.55 μm. Based on the minimum insertion loss, the dimensions of the splitter were optimized for a waveguide with a core size of 4×4 μm². The objective of the study is to create a design for fabrication by three-dimensional direct laser writing optical lithography.
Activating heat pump flexibilities is a viable way to support grid balancing via demand-side management measures and to fulfill the need for flexibility options. Aggregators, as the interface between prosumers, distribution system operators and balance responsible parties, face the challenge of transforming prosumer information into aggregated available flexibility under data privacy and technical restrictions, so that this flexibility can be traded. The literature lacks a generic, applicable and widely accepted flexibility estimation method for heat pumps that incorporates reduced sensor and system information as well as system- and demand-dependent behaviour. In this paper, we adapt and extend a method from the literature, incorporating domain knowledge to overcome reduced sensor and system information. We apply data from five real-world heat pump systems, distinguish operation modes, estimate the power and energy flexibility of each heat pump system, prove the transferability of the method, and aggregate the available flexibilities to showcase a small heat pump pool as a proof of concept.
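The pooling step can be illustrated with a minimal sketch: each unit reports how much power it could add or shed given its operating mode, and the aggregator sums these offers. The device data and mode rules below are invented for illustration, not the paper's estimation method:

```python
# Toy aggregation of per-device heat-pump flexibility into a pool offer.
# Device parameters and the mode logic are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class HeatPump:
    p_nominal: float  # electrical power when running, kW
    running: bool     # current operating mode
    blocked: bool     # e.g. comfort or defrost constraint active

def flexibility(hp):
    """(upward, downward) power flexibility in kW for one unit:
    an idle unit can switch on (upward), a running unit can switch
    off (downward), and a blocked unit offers nothing."""
    if hp.blocked:
        return 0.0, 0.0
    if hp.running:
        return 0.0, hp.p_nominal
    return hp.p_nominal, 0.0

pool = [HeatPump(3.0, True, False),
        HeatPump(2.5, False, False),
        HeatPump(4.0, True, True),
        HeatPump(3.5, False, False),
        HeatPump(3.0, True, False)]

up = sum(flexibility(hp)[0] for hp in pool)
down = sum(flexibility(hp)[1] for hp in pool)
print(up, down)  # 6.0 kW upward, 6.0 kW downward
```

Distinguishing operation modes matters here: the blocked unit contributes nothing even though its nominal power is the largest in the pool.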