In this paper, we consider the question of data aggregation using the practical example of emissions data for economic activities in the sustainability assessment of regional bank clients. Given the current scarcity of company-specific emissions data, an approximation must rely on available public data. These data are reported in different standards across different sources. To determine a mapping between the standards, an adaptation of the Covariance Matrix Self-Adaptation Evolution Strategy is proposed. The obtained results show that high-quality mappings are found. Moreover, our approach is transferable to other data compatibility problems, such as merging emissions data for other countries or bridging the gap between entirely different data sets.
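The search for a mapping between reporting standards can be illustrated with a toy example. The sketch below uses a minimal (μ/μ, λ) evolution strategy with self-adapted step size, a simplified flavour of CMSA-ES rather than the authors' exact adaptation, to fit a hypothetical matrix that converts emissions reported in one standard into another. All data, dimensions, and parameter choices are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "standards mapping" problem (all data invented): find a matrix M that
# converts emissions reported in standard A (4 categories) into standard B
# (3 categories), given matched sample reports.
true_M = rng.random((3, 4))
A = rng.random((50, 4))            # hypothetical reports in standard A
B = A @ true_M.T                   # the same reports expressed in standard B

def fitness(m_flat):
    M = m_flat.reshape(3, 4)
    return float(np.mean((A @ M.T - B) ** 2))   # mean squared mapping error

def es_minimize(dim, generations=300, lam=40, mu=10, sigma=0.3):
    """(mu/mu, lambda) ES with self-adapted step size (simplified CMSA flavour)."""
    parent = rng.random(dim)
    tau = 1.0 / np.sqrt(2.0 * dim)
    for _ in range(generations):
        # each offspring mutates its own step size log-normally, then its position
        sigmas = sigma * np.exp(tau * rng.standard_normal(lam))
        offspring = parent + sigmas[:, None] * rng.standard_normal((lam, dim))
        order = np.argsort([fitness(x) for x in offspring])[:mu]
        parent = offspring[order].mean(axis=0)   # intermediate recombination
        sigma = sigmas[order].mean()             # step-size self-adaptation
    return parent

best = es_minimize(12)
print(f"final mapping error: {fitness(best):.6f}")
```

The full CMSA-ES additionally adapts a covariance matrix of the mutation distribution; the step-size self-adaptation shown here is the part of that scheme that is easiest to convey in a few lines.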
This study presents different approaches to increase the sensing area of NiO-based semiconducting metal oxide gas sensors. Micro- and nanopatterned laser-induced periodic surface structures (LIPSS) are generated on silicon and Si/SiO2 substrates. The surface morphologies of the fabricated samples are examined by FE-SEM. We select the silicon samples with an intermediate Si3N4 layer, due to its superior isolation quality over the thermal oxide, for evaluating the hydrogen and acetone sensitivity of a NiO-based test sensor.
Objectives: The MetabQoL 1.0 is the first disease-specific health-related quality of life (HrQoL) questionnaire for patients with intoxication-type inherited metabolic disorders. Our aim was to assess the validity and reliability of the MetabQoL 1.0 and to investigate the neuropsychiatric burden in our patient population. Methods: Data from 29 patients followed at a single center, aged between 8 and 18 years with a diagnosis of methylmalonic acidemia (MMA), propionic acidemia (PA), or isovaleric acidemia (IVA), and their parents were included. The Pediatric Quality of Life Inventory (PedsQL) was used to evaluate the validity and reliability of the MetabQoL 1.0.
Results: The MetabQoL 1.0 was shown to be valid and reliable (Cronbach's alpha: 0.64–0.9). Fourteen of the 22 patients (63.6%) formally evaluated had neurological findings. Of note, 17 of 20 patients (85%) had a psychiatric disorder when evaluated formally by a child and adolescent psychiatrist. The median mental scores of the MetabQoL 1.0 proxy report were significantly higher than those of the self-report (p = 0.023). Patients with neonatal-onset disease had higher MetabQoL 1.0 proxy physical (p = 0.008), mental (p = 0.042), and total scores (p = 0.022), as well as higher self-report social (p = 0.007) and total scores (p = 0.043), than those with later-onset disease.
Conclusions: This study provides further evidence that the MetabQoL 1.0 is an effective tool to measure what matters in intoxication-type inherited metabolic disorders. Our results highlight the importance of clinical assessment complemented by patient-reported outcomes, which further expand the evaluation toolbox of inherited metabolic diseases.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R^2 = 0.95 and a mean absolute error of MAE = 1.2 t/h are reached, with an average steam production of 35.1 t/h. On average, using a dynamic dataset for training lowers MAE by 32% compared to a static dataset. A feature set based on expert knowledge lowers MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When using a dynamic dataset, the performance parameters of thermal power plants are predicted with high accuracy, allowing short-term problems to be detected.
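The two benchmark metrics reported above can be reproduced for any prediction series. A minimal sketch with hypothetical steam-production values (not the plant's data):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mean_absolute_error(y_true, y_pred):
    return float(np.mean(np.abs(y_true - y_pred)))

# Hypothetical hourly steam-production values in t/h (invented, not plant data).
y_true = np.array([34.0, 36.5, 35.2, 33.8, 36.1, 35.0])
y_pred = np.array([34.5, 36.0, 35.5, 34.0, 35.8, 34.6])

print(f"R^2 = {r2_score(y_true, y_pred):.3f}")       # ≈ 0.850
print(f"MAE = {mean_absolute_error(y_true, y_pred):.2f} t/h")  # ≈ 0.37 t/h
```

Note that R^2 compares the model against a constant mean predictor, so it is sensitive to how variable the target is, whereas MAE stays in the physical unit of the target (here t/h), which is why the abstract reports both.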
Highly-sensitive single-step sensing of levodopa by swellable microneedle-mounted nanogap sensors
(2023)
Microneedle (MN) sensing of biomarkers in interstitial fluid (ISF) can overcome the challenges of self-diagnosis of diseases by a patient, such as blood sampling, handling, and measurement analysis. However, MN sensing technologies still suffer from poor measurement accuracy due to the small amount of target molecules present in ISF, and require multiple steps of ISF extraction, ISF isolation from the MN, and measurement with additional equipment. Here, we present a swellable MN-mounted nanogap sensor that can be inserted into the skin tissue, absorb ISF rapidly, and measure biomarkers in situ by amplifying the measurement signals through redox cycling in nanogap electrodes. We demonstrate that the MN-nanogap sensor measures levodopa (LDA), a medication for Parkinson's disease, down to 100 nM in an aqueous solution, and 1 μM in both a skin-mimicking gelatin phantom and porcine skin.
Organic acidurias (OAs), urea-cycle disorders (UCDs), and maple syrup urine disease (MSUD) belong to the category of intoxication-type inborn errors of metabolism (IT-IEM). Liver transplantation (LTx) is increasingly utilized in IT-IEM. However, its impact has mainly been assessed with clinical outcome measures and rarely with health-related quality of life (HRQoL). The aim of the study was to investigate the impact of LTx on HRQoL in IT-IEM. This single-center prospective study involved 32 patients (15 OA, 11 UCD, 6 MSUD; median age at LTx 3.0 years, range 0.8–26.0). HRQoL was assessed pre/post transplantation by the PedsQL General Module 4.0 and by the MetabQoL 1.0, a specifically designed tool for IT-IEM. PedsQL highlighted significant post-LTx improvements in total and physical functioning in both patients' and parents' scores. According to age at transplantation (≤3 vs. >3 years), younger patients showed higher post-LTx scores on Physical (p = 0.03), Social (p < 0.001), and Total (p = 0.007) functioning. MetabQoL confirmed significant post-LTx changes in Total and Physical functioning in both patients' and parents' scores (p ≤ 0.009). Differently from PedsQL, MetabQoL Mental (patients p = 0.013, parents p = 0.03) and Social scores (patients p = 0.02, parents p = 0.012) were significantly higher post-LTx. Significant improvements (p = 0.001–0.04) were also detected in both self- and proxy-reports for almost all MetabQoL subscales. This study shows the importance of assessing the impact of transplantation on HRQoL, a meaningful outcome reflecting patients' wellbeing. LTx is associated with significant improvements of HRQoL in both self- and parent-reports. The comparison between PedsQL-GM and MetabQoL highlighted that MetabQoL demonstrated higher sensitivity in the assessment of disease-specific domains than the generic PedsQL tool.
Long-term outcome of infantile onset Pompe disease patients treated with enzyme replacement therapy
(2024)
Background: Enzyme replacement therapy (ERT) with recombinant human alglucosidase alfa (rhGAA) was approved in Europe in 2006. Nevertheless, data on the long-term outcome of infantile onset Pompe disease (IOPD) patients at school age is still limited.
Objective: We analyzed in detail cardiac, respiratory, motor, and cognitive function of 15 German-speaking patients aged 7 and older who started ERT at a median age of 5 months.
Results: Starting dose was 20 mg/kg biweekly in 12 patients, 20 mg/kg weekly in 2, and 40 mg/kg weekly in one patient. CRIM-status was positive in 13 patients (86.7%) and negative or unknown in one patient each (6.7%). Three patients (20%) received immunomodulation. Median age at last assessment was 9.1 (7.0–19.5) years. At last follow-up 1 patient (6.7%) had mild cardiac hypertrophy, 6 (42.9%) had cardiac arrhythmias, and 7 (46.7%) required assisted ventilation. Seven patients (46.7%) achieved the ability to walk independently and 5 (33.3%) were still ambulatory at last follow-up. Six patients (40%) were able to sit without support, while the remaining 4 (26.7%) were tetraplegic. Eleven patients underwent cognitive testing (Culture Fair Intelligence Test), while 4 were unable to meet the requirements for cognitive testing. Intelligence quotients (IQs) ranged from normal (IQ 117, 102, 96, 94) in 4 patients (36.4%) to mild developmental delay (IQ 81) in one patient (9.1%) to intellectual disability (IQ 69, 63, 61, 3x < 55) in 6 patients (54.5%). White matter abnormalities were present in 10 out of 12 cerebral MRIs from 7 patients.
Measuring what matters
(2023)
Patient-reported outcomes (PROs) are generally defined as ‘any report of the status of a patient's health condition that comes directly from the patient, without interpretation of the patient's response by a clinician or anyone else’. A broader definition of PROs also includes ‘any information on the outcomes of health care obtained directly from patients without modification by clinicians or other health care professionals’. Following this approach, PROs encompass patients' subjective perceptions of how they function or feel, not only in relation to a health condition but also to its treatment, as well as concepts such as health-related quality of life (HrQoL), information on a patient's functional status, signs and symptoms, and symptom burden. PRO measurement instruments (PROMs) are mostly questionnaires and inform about what patients can do and how they feel. PROs and PROMs have not yet found unconditional acceptance and wide use in the field of inborn errors of metabolism (IEM). This review summarises the importance and usefulness of PROs in research, drug legislation, and clinical care and informs about quality standards, development, and potential methodological shortfalls of PROMs. Inclusion of PROs measured with high-quality, well-selected PROMs into clinical care, drug legislation, and research helps to identify unmet needs, improve quality of care, and define outcomes that are meaningful to patients. The field of IEM should open up to new methodological approaches, such as defining core sets of variables, including PROs, to be systematically assessed in specific metabolic conditions, and to new collaborations with PRO experts such as psychologists to facilitate the systematic collection of meaningful data.
Why do some countries assign a major role to wind energy in decarbonizing their electricity systems, while others are much less committed to this technology? We argue that processes of (de-)legitimation, driven by discourse coalitions who strategically employ certain storylines in public debates, provide part of the answer. To illustrate our approach, we comparatively investigate public discourses surrounding wind energy in Austria and Switzerland, two countries that differ strongly in wind energy deployment. By combining a qualitative content analysis and a discourse network analysis of 808 newspaper articles published 2010–2020, we identify four distinct sets of storylines used to either delegitimize or legitimize the technology. Our study indicates that low deployment rates in Switzerland can be related to the prominence of delegitimizing storylines in the public discourse, which result in a rather low socio-political acceptance of wind energy. In Austria, by contrast, there is more consistent support for wind energy by discourse coalitions using a broad set of legitimizing storylines. By bridging the related but separate literatures of technology legitimacy and social acceptance, our study contributes to a better understanding of socio-political conflict and divergence in low-carbon technological pathways.
A step change is needed in the deployment of renewable energy if the triple challenge of ensuring climate change mitigation, energy security, and energy affordability is to be met. Yet, social acceptance of infrastructure projects and policies remains a key concern. While there have been decades of fruitful research on the social acceptance of wind energy and other renewables, much of the extant research is cross-sectional in nature, failing to capture the important dynamic processes that can make or break renewable energy projects. This paper introduces a Special Issue of Energy Policy which focuses on the neglected topic of the dynamics of social acceptance of renewable energy, drawing on contributions made at an international research conference held in St. Gallen (Switzerland) in June 2022. In addition to introducing these papers and drawing out common themes, we also seek to offer some conceptual clarity on the issue of dynamics in social acceptance, taking into account the influence of time, power, and scale in shaping decision-making processes. We conclude by highlighting a number of avenues of potential future research.
X-ray microtomography is a nondestructive, three-dimensional inspection technique applied across a vast range of fields and disciplines, from research to industry, encompassing engineering, biology, and medical research. Phase-contrast imaging extends the domain of application of x-ray microtomography to classes of samples that exhibit weak attenuation and thus appear with poor contrast in standard x-ray imaging. Notable examples are low-atomic-number materials, like carbon-fiber composites, soft matter, and biological soft tissues. We report on a compact and cost-effective system for x-ray phase-contrast microtomography. The system features high sensitivity to phase gradients and high resolution, requires a low-power sealed x-ray tube and a single optical element, and fits in a small footprint. It is compatible with standard x-ray detector technologies: in our experiments, we observed that single-photon counting offered higher angular sensitivity, whereas flat panels provided a larger field of view. The system is benchmarked against known material phantoms, and its potential for soft-tissue three-dimensional imaging is demonstrated on small-animal organs: a piglet esophagus and a rat heart. We believe that the simplicity of the setup we are proposing, combined with its robustness and sensitivity, will facilitate access to quantitative x-ray phase-contrast microtomography as a research tool across disciplines, including tissue engineering, materials science, and nondestructive testing in general.
Parametric anti-resonance is a phenomenon that occurs in systems with at least two degrees of freedom and can be achieved by periodically exciting some parameters of the system. The effect of this properly tuned periodicity is to increase the dissipation in the system, which leads to an increase in the effective damping of vibrations. This contribution presents the design of an open-loop control to reduce the settling time using the anti-resonance concept. The control signal consists of a quasi-periodic signal capable of transferring the system's oscillations from one mode to another. The general averaging technique is used to characterize the dynamics, particularly the so-called slow dynamics of motion. With this analysis, the control signal is designed for the potential application of a microelectromechanical sensor arrangement; for this specific example, a reduction of the settling time of up to 96.8% is achieved.
In this work, parametric excitation is introduced in a fully balanced flexible rotor mounted on two identical active gas foil bearings. The active gas foil bearings change the top foil shape harmonically with a specific amplitude and frequency. The deformable foil shape is approximated by an analytical function, while the gas pressure distribution is evaluated by the numerical solution of the Reynolds equation for compressible flow. The harmonic variation of the foil shape generates a respective variation in the bearings' stiffness and damping properties, and the system experiences parametric resonances and anti-resonances at specific excitation frequencies. The nonlinear gas bearing forces generate bifurcations in the solutions of the system at certain rotating speeds and excitation frequencies; period-doubling and Neimark-Sacker bifurcations are noticed in the examined system, and their progress is evaluated as the two bifurcation parameters (rotating speed and parametric excitation frequency) are changed, through a codimension-2 numerical continuation of limit cycles. It is found that in a specific range of excitation frequencies there are parametric anti-resonances, and the bifurcations collide and vanish. Therefore, a bifurcation-free operating range is established, and the system can operate stably over a wide speed range.
Digitalization is changing business models and operational processes. At the same time, improved data availability and powerful analytical methods are influencing controlling and increasingly require the use of statistical and information technology skills and knowledge. Using a case study from marketing controlling, the article shows the use of business analytics methods and addresses the tasks of controlling in the digital age.
By a simple femtosecond laser process, we fabricated metal-oxide/gold composite films for electrical and optical gas sensors. We designed a triple-wavelength AWG spectrometer matched to the plasma absorption wavelength region of the composite films. H2/CO absorptions fit well with the AWG design for multi-gas detection sensor arrays.
Pooled data from published reports on infants with clinically diagnosed vitamin B12 (B12) deficiency were analyzed with the purpose of describing the presentation, diagnostic approaches, and risk factors for the condition, in order to inform prevention strategies. An electronic (PubMed database) and manual literature search following the PRISMA approach was conducted (preregistered with the Open Science Framework, accessed on 15 February 2023). Data from 102 publications (292 cases) were described and analyzed using correlation analyses, chi-square tests, ANOVAs, and regression analyses. The mean age at first symptoms (anemia, various neurological symptoms) was four months; the mean time to diagnosis was 2.6 months. Maternal B12 at diagnosis, exclusive breastfeeding, and a maternal diet low in B12 predicted infant B12, methylmalonic acid, and total homocysteine. Infant B12 deficiency is still not easily diagnosed. Methylmalonic acid and total homocysteine are useful diagnostic parameters in addition to B12 levels. Since maternal B12 status predicts infant B12 status, it would probably be advantageous to target women in early pregnancy, or even preconceptionally, to prevent infant B12 deficiency, rather than to rely on newborn screening, which often does not reliably identify high-risk children.
Grey-box models provide an important approach for control analysis in the heating, ventilation and air conditioning (HVAC) sector. Grey-box models consist of physical models whose parameters are estimated from data. Given the vast number of component models in the literature, the question arises: which component models perform best on a given system or dataset? This question is investigated systematically using a test case system with real operational data. The test case consists of an HVAC system containing an energy recovery unit (ER), a heating coil (HC), and a cooling coil (CC). For each component, several suitable model variants from the literature are adapted appropriately and implemented: four model variants for the ER and five each for the HC and CC. Further, three global and four local optimization algorithms for solving the nonlinear least-squares system identification are implemented, leading to a total of 700 combinations. The comparison of all variants shows that the global optimization algorithms do not provide significantly better solutions, while their runtimes are significantly higher. Analysis of the models shows that model accuracy depends on the total number of parameters.
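The grey-box idea, a physical model whose parameters are estimated from operational data, can be sketched in its simplest form: an energy-recovery unit described by a single effectiveness parameter. The one-parameter model form and all data below are assumptions for illustration, not one of the paper's implemented variants (which are nonlinear and solved iteratively); here the model is linear in its parameter, so the least-squares estimate is closed-form.

```python
import numpy as np

rng = np.random.default_rng(1)

# Grey-box sketch: an energy-recovery (ER) unit modelled by one physical
# parameter, its effectiveness eps:  T_sup = T_out + eps * (T_exh - T_out).
eps_true = 0.72                           # assumed "real" effectiveness
T_out = rng.uniform(-5.0, 15.0, 200)      # outdoor air temperature [degC]
T_exh = rng.uniform(20.0, 24.0, 200)      # exhaust air temperature [degC]
T_sup = T_out + eps_true * (T_exh - T_out) + rng.normal(0, 0.2, 200)  # noisy data

# The model is linear in eps, so least squares has a closed-form solution:
x = T_exh - T_out
y = T_sup - T_out
eps_hat = (x @ y) / (x @ x)

residual_rms = float(np.sqrt(np.mean((T_out + eps_hat * x - T_sup) ** 2)))
print(f"estimated effectiveness: {eps_hat:.3f}, RMS residual: {residual_rms:.3f} K")
```

The component models compared in the paper have more parameters and nonlinear structure, which is exactly why iterative local and global least-squares solvers enter the picture; this sketch only shows the estimation principle.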
Purpose – The purpose of this study is to explore the exogenous and endogenous drivers of the high growth of Unicorn start-ups along their life cycle, with a particular focus on Unicorns in the fintech industry.
Design/methodology/approach – The study employs an explorative longitudinal analysis of a matched pair of two Unicorn start-ups with similar antecedent features to holistically understand drivers over the longer term.
Findings – High-growth patterns over the longer term are the result of a combined industry- and company-life-cycle perspective. Drivers and growth patterns vary significantly according to the time of entry into the industry and its development status. The findings are systematised within a set of propositions to be tested in future research.
Research limitations/implications – The limitations lie in the empirical evidence, as the analysis is limited to one matched pair. The revealed drivers of Unicorns' long-term growth might encourage future research to investigate them on a larger scale.
Practical implications – The study offers practical recommendations for start-ups with high-growth ambitions and advice to policy makers regarding the development of tailor-made support programs.
Originality/value – The study significantly extends extant work on growth and high growth by examining endogenous and exogenous triggers over time and by linking the Unicorn life cycle to the industry life cycle, an approach which, to the best of the authors' knowledge, has not yet been applied.
International Entrepreneurship explains the opportunities and challenges facing internationalizing entrepreneurial ventures. The book includes a thorough discussion of fundamentals as well as contemporary research findings. Numerous cases, featuring diverse contexts, illustrate the theory and support classroom use.
The main aims of this work are the validation of the developed process for gluing a single-mode optical fiber array to a photonic chip and the selection of the more suitable of the two adhesives being compared. An active alignment system was used for adjusting the two optical fiber arrays to a photonic chip. The gluing was done with the two compared UV-curable adhesives applied in the optical path. The insertion losses of the glued coupling were measured and investigated at two discrete wavelengths, 1310 nm and 1550 nm, during temperature testing in the climatic chamber according to Telcordia GR-1209-CORE [3]. The measurement, investigation, and comparison of insertion losses of the glued coupling over the spectral band from 1530 nm to 1570 nm were done immediately after the gluing process and again, after a one-month delay, following three temperature cycles in the climatic chamber.
In 2021, a prominent Austrian dairy producer suffered an IT attack and was completely paralysed. Without clearly defined mitigation measures in place, major disruptions were caused along the whole supply chain, including logistics service providers, governmental food safety bodies, and retailers (i.e., supermarkets and convenience stores). In this paper, we ask how digitisation and digital transformation impact IT security, especially considering the complex company ecosystems of food production and food supply chains in Austria. The problem statement stems from a gap in knowledge about key differences in approaches to IT security, resilience, risk management, and especially the business interfaces between food suppliers, supermarkets, distributors, logistics and other service providers. To answer the related research questions, the authors first conduct literature research, highlight common guidelines and standardisation, and look at state-based recommendations for critical infrastructure. In a second step, the paper describes in detail a quantitative and qualitative survey of Austrian food companies (producers and retailers). A description of recommended measures for the industry, further steps, and an outlook conclude the paper.
Background: Cardiovascular disease is the major cause of death worldwide. Although knowledge regarding diagnosing and treating cardiovascular disease has increased dramatically, secondary prevention remains insufficiently implemented because affected individuals fail to adhere to guideline recommendations. This continues to lead to high morbidity and mortality rates. Involving patients in their healthcare and facilitating their active role in managing their chronic disease is an opportunity to meet the needs of the increasing number of cardiovascular patients. However, simple recall of advice regarding a more preventive lifestyle does not produce sustainable behavioral lifestyle changes. We investigate the effect of plaque visualization combined with low-threshold daily lifestyle tasks using the smartphone app PreventiPlaque to evaluate change in cardiovascular risk profile. Methods and study design: This randomized, controlled clinical trial includes 240 participants with ultrasound evidence of atherosclerotic plaque in one or both carotid arteries, defined as focal thickening of the vessel wall measuring 50% more than the regular vessel wall. A criterion for participation is access to a smartphone suitable for app usage. The participants are randomly assigned to an intervention or a control group. While both groups receive the standard of care, the intervention group has additional access to the PreventiPlaque app during the 12-month follow-up. The app includes daily tasks that promote a healthier lifestyle in the areas of smoking cessation, medication adherence, physical activity, and diet. The impact of plaque visualization and app use on the change in cardiovascular risk profile is assessed by SCORE2. The feasibility and effectiveness of the PreventiPlaque app are evaluated using standardized and validated measures for patient feedback.
The production of liquid-gas mixtures with desired properties still places high demands on process technology and is usually realized in bubble columns. The physical calculation models used have individual dimensionless factors which, depending on the application, are only valid for small ranges of flow velocity, nozzle geometry, and test setup. An iterative but time-consuming design of such dispersion processes is used in industry to produce a liquid-gas mixture according to desired requirements. In the present investigation, we accelerate the necessary design loops by setting up a physical model consisting of several subsystems that are enriched by dedicated experiments, in order to realize liquid-gas dispersions with low volume fraction and small air bubble diameters in oil. Our approach allows the extraction of individual dimensionless factors from maps of the introduced subsystems. These maps allow for targeted corrective measures in a production process to maintain quality. The calculation-based approach avoids the need for iterative design loops. Overall, this approach supports the controlled generation of liquid-gas mixtures.
Creating a schedule to perform certain actions in a real-world environment typically involves multiple types of uncertainty. To create a plan that is robust towards uncertainties, it must stay flexible while attempting to be reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive. The planning algorithms must be able to re-schedule planned actions and adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes. This is often because they aim to start with complicated and time-intensive machine learning scenarios. This work evaluates Process Capability Analysis (PCA) as a pragmatic, easy, and quick way of leveraging the machine data gathered from the production process. The area of application considered is injection molding. After describing the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA yields multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing whole process data instead of considering only selected production samples. The use of machine data enables additional insights into process stability and the associated product quality.
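The key performance indicators of a process capability analysis are the standard Cp and Cpk indices, which relate the spread and centering of a measured quality characteristic to its specification limits. A minimal sketch with hypothetical specification limits and part weights (not data from the case study):

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Cp and Cpk for a quality characteristic with lower/upper spec limits."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                 # potential capability (spread only)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # actual capability (spread + centering)
    return float(cp), float(cpk)

rng = np.random.default_rng(2)
# Hypothetical part weights [g] from an injection-moulding run.
weights = rng.normal(loc=12.50, scale=0.03, size=500)
cp, cpk = process_capability(weights, lsl=12.40, usl=12.60)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

Cpk is always at most Cp; a gap between the two signals that the process is off-center even if its spread is acceptable, which is the kind of fast, comprehensible indicator the monitoring approach above relies on.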
Tap or swipe
(2023)
Demand-side management approaches that exploit the temporal flexibility of electric vehicles have attracted much attention in recent years due to their increasing market penetration. These demand-side management measures contribute to alleviating the burden on the power system, especially in distribution grids, where bottlenecks are more prevalent. Electric vehicles can be seen as an attractive asset for distribution system operators, with the potential to provide grid services if properly managed. In this thesis, first, a systematic investigation is conducted of two demand-side management methods typically reported in the literature: a voltage droop control-based approach and a market-driven approach. Then, a control scheme for decentralized autonomous demand-side management of electric vehicle charging scheduling is proposed, which relies on a unidirectionally communicated grid-induced signal. In all the topics considered, the implications for distribution grid operation are evaluated using a set of time-series load flow simulations performed for representative Austrian distribution grids. Droop control mechanisms, which require no communication, are discussed for electric vehicle charging control. The method provides an economically viable solution at all penetrations if electric vehicles charge at low nominal power rates. However, given the current market trends in residential charging equipment, especially in the European context, where most charging equipment is designed for 11 kW charging, the long-run technical feasibility of the method is debatable. As electricity demand strongly correlates with energy prices, a linear optimization algorithm is proposed to minimize charging costs, which uses next-day market prices as the grid-induced incentive function under the assumption of perfect user predictions. The constraints on the state of charge guarantee that the energy required for driving is delivered without failure.
An average energy cost saving of 30% is realized at all penetrations. However, the avalanche effect due to simultaneous charging during low-price periods introduces new power peaks exceeding those of uncontrolled charging, obstructing the grid-friendly integration of electric vehicles.
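The cost-minimizing charging schedule described above can be sketched as a small linear program. This is a minimal illustration, assuming hypothetical day-ahead prices, an 11 kW power limit, and a fixed next-day energy requirement; none of these numbers come from the thesis.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical day-ahead prices (EUR/kWh) for 24 hourly slots:
# four cheap night-time hours surrounded by expensive ones.
prices = np.array([0.30] * 7 + [0.10] * 4 + [0.30] * 13)

P_MAX = 11.0   # charging power limit per slot (kW), e.g. an 11 kW wallbox
E_REQ = 30.0   # energy required for the next day's driving (kWh)

# Decision variables: energy charged in each hourly slot (kWh).
# Minimize total cost subject to delivering exactly E_REQ.
res = linprog(
    c=prices,
    A_eq=np.ones((1, 24)), b_eq=[E_REQ],
    bounds=[(0.0, P_MAX)] * 24,
    method="highs",
)
schedule = res.x
```

Note how the optimum concentrates the entire demand in the low-price hours; at scale, this is exactly the avalanche effect the thesis observes.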
In the era of digital transformation, leadership itself is evolving: new perspectives on leadership are required, especially in virtual teams. Shared leadership is a promising leadership form for meeting the challenges of a virtual team setting. In particular, studies show that shared leadership increases performance, team creativity, and innovative behavior. Moreover, responsibility is distributed among several individuals rather than resting with one. Nevertheless, it is unclear which skills are needed in shared leadership teams and how they could be trained. We therefore develop a conceptual framework to pave the way for an empirical inquiry into the skills for, and the role of, shared leadership. Moreover, we encourage the discussion of whether current leadership development is still viable and offer practical implications for developing shared leadership.
A model is presented that allows for the calculation of the probability with which a vanilla Evolution Strategy converges to the global optimizer of the Rastrigin test function. From this model, a population size scaling formula is derived that estimates the population size needed to ensure high convergence security depending on the search space dimensionality.
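The setting can be illustrated with a minimal vanilla Evolution Strategy on the Rastrigin function. The (μ/μ_I, λ)-ES with log-normal σ self-adaptation below is a generic sketch, not the exact algorithm or the scaling formula of the paper; all parameter values are illustrative.

```python
import numpy as np

def rastrigin(x):
    # Global minimum f(0) = 0; highly multimodal elsewhere.
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def es_run(dim=5, lam=60, gens=400, seed=0):
    # Minimal (mu/mu_I, lambda)-ES with log-normal sigma self-adaptation.
    rng = np.random.default_rng(seed)
    mu = lam // 4
    tau = 1.0 / np.sqrt(2.0 * dim)
    x, sigma = rng.uniform(-5.0, 5.0, dim), 1.0
    for _ in range(gens):
        sigmas = sigma * np.exp(tau * rng.standard_normal(lam))
        offspring = x + sigmas[:, None] * rng.standard_normal((lam, dim))
        fit = np.array([rastrigin(o) for o in offspring])
        best = np.argsort(fit)[:mu]
        x = offspring[best].mean(axis=0)      # intermediate recombination
        sigma = sigmas[best].mean()
    return x, rastrigin(x)

def success_rate(lam, trials=20):
    # Empirical convergence security: fraction of runs ending near the
    # global optimizer (f < 1 counted as a success).
    return np.mean([es_run(lam=lam, seed=s)[1] < 1.0 for s in range(trials)])
```

Sweeping `lam` in `success_rate` gives an empirical counterpart to the kind of population-size-versus-convergence-security relationship the paper models analytically.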
The thorny issue of time
(2023)
Digital twin as enabler of business model innovation for infrastructure construction projects
(2023)
Emerging technologies and methods are becoming an important element of the construction industry. Digital twins are used as a base to store data in BIM models, to make use of these data, and to make them visible. Transparency in all phases of the lifecycle of building and infrastructure assets is crucial for a more efficient lifecycle of planning, construction, and maintenance. Whereas other industries have increased performance in these phases by exploiting their data, the construction industry remains stuck in traditional methods and business models. In this paper, we propose a concept that focuses on the digital production twin. Comparing planning data with as-is production data can empower a data-driven continuous improvement process and support the decision-making process for future innovations and suitable business models. This paper outlines the possibility of using the data stored in a digital twin for the evaluation of possible business models.
Through mandatory ESG (environmental, social, governance) reporting, large companies must disclose their ESG activities, showing how sustainability risks are incorporated into their decision-making and production processes. This disclosure obligation, however, does not apply to small and medium-sized enterprises (SMEs), creating a gap in the ESG dataset. Banks are therefore required to collect sustainability data on their SME customers independently to ensure complete ESG integration in the risk analysis process for loans. In this paper, we examine ESG risk analysis through a smart science approach, focusing on possible value outcomes of sustainable smart services for banks as well as for their (SME) customers. The paper describes ESG factors, how services can be derived from them, targeted ESG metrics, and an ESG Service Creation Framework (business ecosystem building, process model, and value creation). The description of an exemplary use case, highlighting the necessary ecosystem for service creation as well as the created value, concludes the paper.
The role of entrepreneurs and intrapreneurs in the current zeitgeist is to drive innovation and to reshape rigid, established processes for businesses as well as for consumers. They use new viewpoints to pioneer new (business) models which focus on ‘smartness’ rather than the purely monetary and short-sighted models of yesteryear. Fostering and supporting the culture of this current zeitgeist is a major challenge for entre- and intrapreneurial support infrastructures, namely startup centres and innovation hubs of universities and other public institutions as well as innovation centres of private companies. Support may range from access to funding over the provision of resources such as offices or computing hardware to coaching in the development of business ideas and strategic roadmaps for product and service deployment. In this paper, we focus on describing the status quo of the aforementioned support infrastructures in Vorarlberg and the Lake Constance region, then extend the scope to existing (international) approaches for aiding founders and innovators in the development of smart services. An analysis of success stories of the Vorarlberg startup centre ‘startupstube’ and other initiatives, including their comparison to international counterparts, builds the basis for a methodological framework for (service science) coaching in entre- and intrapreneurial support infrastructures. The paper concludes with the description of a framework for choosing the right methods and tools to create service value in entre-/intrapreneurship based upon tested, proven know-how, and for defining support infrastructure needs based upon pre-defined stakeholder and target groups as well as the (industry) sectors of the innovators.
In this paper, a 256-channel, 10-GHz arrayed waveguide grating (AWG) demultiplexer for ultra-dense wavelength division multiplexing was designed using an in-house developed tool called AWG-Parameters. The AWG demultiplexer was designed for a central wavelength of 1550 nm, and the structure was simulated in the PHASAR tool from Optiwave. Two different AWG designs were developed, and the influence of the design parameters on the AWG performance was studied.
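The core relations behind such an AWG parameter calculation can be sketched in a few lines. This is a back-of-envelope estimate for the 256-channel, 10-GHz design at 1550 nm; the group index value is an assumed placeholder, not a value reported in the paper.

```python
# Back-of-envelope AWG sizing; n_g is an assumed illustration value.
C = 299_792_458.0           # speed of light in vacuum (m/s)

n_channels = 256
channel_spacing_hz = 10e9   # 10 GHz channel spacing
lambda_c = 1550e-9          # central wavelength (m)
n_g = 1.5                   # assumed group index of the waveguide

# The free spectral range must span all output channels:
fsr_hz = n_channels * channel_spacing_hz     # 2.56 THz

# Path-length increment between adjacent arrayed waveguides,
# from FSR = c / (n_g * delta_L):
delta_L = C / (n_g * fsr_hz)                 # metres (~78 um here)

# Approximate grating diffraction order: m ~ nu_c / FSR.
m = round(C / lambda_c / fsr_hz)
```

A dedicated design tool refines these first-order numbers with the actual waveguide dispersion and free-propagation-region geometry.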
Design, simulation, and optimization of the 1×4 optical three-dimensional multimode interference splitter using IP-Dip polymer as a core and polydimethylsiloxane (PDMS) Sylgard 184 as a cladding is demonstrated. The splitter was simulated using the beam propagation method in the BeamPROP simulation module of the RSoft photonic tool and optimized for an operating wavelength of 1.55 μm. According to the minimum insertion loss, the dimensions of the splitter were optimized for a waveguide with a core size of 4×4 μm². The objective of the study is to create the design for fabrication by three-dimensional direct laser writing optical lithography.
Activating heat pump flexibilities is a viable way to support grid balancing via demand side management measures and to fulfill the need for flexibility options. Aggregators, as the interface between prosumers, distribution system operators, and balance responsible parties, face the challenge of transforming prosumer information into aggregated available flexibility that can be traded, under data privacy and technical restrictions. The literature lacks a generic, applicable, and widely accepted flexibility estimation method for heat pumps that copes with reduced sensor and system information and captures system- and demand-dependent behaviour. In this paper, we adapt and extend a method from the literature by incorporating domain knowledge to overcome reduced sensor and system information. We apply data from five real-world heat pump systems, distinguish operation modes, estimate the power and energy flexibility of each single heat pump system, prove the transferability of the method, and aggregate the available flexibilities to showcase a small heat pump pool as a proof of concept.
Open tracing tools
(2023)
Background: To cope with the rapidly growing complexity of contemporary software architectures, tracing has become an increasingly critical practice and has been widely adopted by software engineers. By adopting tracing tools, practitioners are able to monitor, debug, and optimize distributed software architectures easily. However, with an excessive number of valid candidates, researchers and practitioners have a hard time finding and selecting suitable tracing tools by systematically considering their features and advantages. Objective: To this end, this paper aims to provide an overview of popular open tracing tools via comparison. Methods: We first identified 30 tools in an objective, systematic, and reproducible manner by adopting the Systematic Multivocal Literature Review protocol. Then, we characterized each tool in terms of 1) its measured features, 2) its popularity in both peer-reviewed literature and online media, and 3) its benefits and issues. We used topic modeling and sentiment analysis to extract and summarize the benefits and issues, and adopted ChatGPT to support topic interpretation. Results: This paper presents a systematic comparison among the selected tracing tools in terms of their features, popularity, benefits, and issues. Conclusion: The results show that each tracing tool provides a unique combination of features with different pros and cons. The contribution of this paper is to provide practitioners with a better understanding of tracing tools, facilitating their adoption.
Hot water heat pumps are well suited for demand side management: the heat pump market has grown rapidly in recent years, with a trend towards decentralized domestic hot water preparation. Sales were accelerated by demands for energy conservation and energy efficiency, and by less restrictive rules regarding Legionella. While the model predictive control potential of heat pumps is commonly shown in simulations in the literature, the share of experimental studies is relatively low, and experimental studies considering solely domestic hot water use have to date not been available. In this paper, the realistically achievable model predictive control potential of a hot water heat pump is compared to standard hysteresis control to provide an experimental proof. We show for the first time how state-of-the-art approaches (model predictive control, system identification, live state estimation, and demand prediction) can be transferred from electric hot water heaters to hot water heat pumps, combined, and implemented in a real-world hot water heat pump setup. The optimization approach, embedded in a realistic experimental setting, decreases the electric energy demand and the cost per unit of electricity by approximately 12% and 14%, respectively. Further, an increase in efficiency of approximately 13% has been achieved.
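The standard hysteresis baseline against which the optimization is compared can be sketched with a toy storage-tank model. All parameter values below (tank size, heat pump power, loss coefficient, temperature band) are illustrative assumptions, not measurements from the experimental setup.

```python
# Toy storage-tank model under hysteresis (thermostat) control.
# All parameters are illustrative assumptions.
M_C = 200 * 4186.0   # thermal mass: 200 L of water x c_p (J/K)
Q_HP = 2000.0        # thermal power of the heat pump when on (W)
UA = 2.0             # tank heat-loss coefficient (W/K)
T_AMB = 20.0         # ambient temperature (deg C)
DT = 60.0            # simulation time step (s)

def hysteresis(T, on, T_on=47.0, T_off=52.0):
    # Switch on below the lower bound, off above the upper bound,
    # otherwise keep the previous state.
    if T < T_on:
        return True
    if T > T_off:
        return False
    return on

def simulate(hours=24):
    T, on, runtime = 50.0, False, 0.0
    temps = []
    for _ in range(int(hours * 3600 / DT)):
        on = hysteresis(T, on)
        q = Q_HP * on - UA * (T - T_AMB)   # net heat flow into the tank (W)
        T += q * DT / M_C
        runtime += on * DT
        temps.append(T)
    return temps, runtime
```

The controller keeps the temperature inside the band regardless of electricity prices; a model predictive controller additionally shifts the on-times towards cheap or efficient periods, which is where the reported savings come from.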
Immersive educational spaces
(2023)
"If only we had had such opportunities to grasp history like this when I was young" – words by an almost 80-year-old woman holding an iPad on which both the buildings in the background and a tower in the form of a virtual 3D object appear within reach. To "grasp" history - what an apt use of this action-oriented word for an augmented reality application built on considerations of thinking and acting in history. This telling image emerged during the first test run of the app i.appear, which will be the focus of this article's considerations on the use of immersive learning environments. The application i.appear has been used in the city of Dornbirn (Austria) for a year now to teach historical content through location-based augmented reality and other interactive and multimedia technologies. After a brief description of the potential of such applications, the epistemological structure of the hosting app i.appear and its functionality will be outlined. This article focuses on the "Baroque Master Builders" tour of the hosting app, which was created and tested as part of the current research.
This paper presents the design, simulation, and optimization of the three-dimensional 1×4 optical multimode interference splitter using IP-Dip polymer as a core and polydimethylsiloxane (PDMS) Sylgard 184 as a cladding. The splitter was simulated using the beam propagation method in the BeamPROP simulation engine of the RSoft photonic tool and optimized for an operating wavelength of 1.55 µm. According to the minimum insertion loss, the dimensions of the MMI coupler and the length of the whole MMI splitter structure were optimized applying a waveguide with a core size of 4×4 µm². The objective of the study is to create a design for fabrication by three-dimensional direct laser writing optical lithography.
We present the design of a planar 16-channel, 100-GHz multi-mode polymer-based AWG. The AWG was designed for a central wavelength of 1550 nm by applying the AWG-Parameters tool. The AWG structure was created and simulated in the commercial photonic tool PHASAR from Optiwave, and the achieved transmission characteristics were evaluated with the AWG-Analyzer tool. For the design, multi-mode waveguides with a cross-section of (4×4) µm² were used. The simulated results show a strong worsening of the transmission characteristics in comparison with single-mode waveguides; nevertheless, the transmitting channels are clearly separated. The reason for using thicker multi-mode waveguides in the design is the possibility of fabricating the AWG structure on a polymer basis using direct laser writing lithography.
Coupling is one of the most frequently mentioned metrics in software systems. However, measuring logical coupling between microservices requires runtime information or the availability of service log files for analyzing the calls between services. This work presents our emerging results, in which we propose a metric to statically calculate the logical coupling between microservices based on commits to versioning systems. We performed an initial validation of the proposed metric with a dataset containing 145 open-source microservice projects. The results illustrate how logical coupling affects every system and increases over time. However, we did not find a correlation between the number of commits or the number of developers and the introduction of logical coupling. In future work, we will investigate why, how, and when logical coupling is introduced in a system.
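A commit-based logical coupling measure of the kind proposed here can be sketched as follows. The service names and the Jaccard-style normalization are illustrative assumptions; the abstract does not reproduce the paper's exact formula.

```python
from collections import Counter
from itertools import combinations

# Sketch: two services are logically coupled when they change together
# in the same commit. Hypothetical commit history, one set of touched
# services per commit.
commits = [
    {"orders", "payments"},
    {"orders", "payments", "users"},
    {"users"},
    {"orders"},
]

pair_changes = Counter()
service_changes = Counter()
for services in commits:
    service_changes.update(services)
    pair_changes.update(map(frozenset, combinations(sorted(services), 2)))

def logical_coupling(a, b):
    # Share of commits touching either service that touch both
    # (a Jaccard-style normalization, chosen here for illustration).
    both = pair_changes[frozenset((a, b))]
    either = service_changes[a] + service_changes[b] - both
    return both / either if either else 0.0
```

For the toy history above, `orders` and `payments` co-change in 2 of the 3 commits touching either, giving a coupling of 2/3, while `orders` and `users` reach only 1/4.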
In this work, we investigated the influence of different etch depths of the rib waveguides on the performance of SiN-based AWGs. For this purpose, an 8-channel 100 GHz AWG was designed for a center wavelength of 850 nm. The design parameters entered were calculated using the AWG-Parameters tool. The simulations were performed with a commercial photonic tool PHASAR from Optiwave. The simulated performance was evaluated using the AWG-Analyzer tool. For the AWG design, we used three identical rib waveguides with different etch depths to simulate possible etch imperfection. The simulations show the wavelength shift and degradation of the AWG performance.
Optoelectronic system based on photonic integrated circuits to miniaturize spectral domain OCT
(2023)
We present a miniaturized optical coherence tomography (OCT) setup based on photonic integrated circuits (PIC) for the 850 nm range. We designed a 512-channel arrayed waveguide grating (AWG) on a PIC for spectral domain OCT (SD-OCT) that is co-integrated with PIN photodiodes and analog-to-digital converters on one single chip. This image sensor is combined with all the necessary electronics to act as a camera. It is integrated into a fiber-based OCT system, achieving a sensitivity of > 80 dB, and various samples are imaged. This optoelectronic system will allow building small and cost-effective OCT systems to monitor retinal diseases.
Semiconducting metal oxides are widely used for solar cells, photo-catalysis, bio-active materials, and gas sensors. Besides the material properties of the semiconductor used, the specific surface topology of the sensor determines the device performance. We investigate the preparation and transfer of suitable metals onto LIPSS structures on glass for gas sensing applications.
Deep-etched structures in GaAs with high aspect ratios have promising applications in optoelectronics and MEMS devices. The key factors in their fabrication process are the choice of a proper mask material and of etching conditions that yield high selectivity and an anisotropic etch profile with smooth sidewalls. In this work, we studied several mask materials (Al, Ni, Cr, SiO2) for deep reactive ion etching of GaAs using an inductively coupled plasma (ICP) system. Several sets of experiments were performed with varying gas mixture, pressure, and ICP/RF power. As a result, we found optimized conditions and the minimal thickness of mask material for achieving deep-etched (>140 µm) GaAs structures.
The various (nano-)forms of carbon, so-called allotropes, have become one of the most active topics in fundamental and applied research. A universal deposition process capable of "adjusting" system parameters in one deposition chamber is therefore in high demand. Here, we present a low-pressure large-area deposition system combining radiofrequency (RF) and microwave (MW) plasma in one chamber in different configurations, which offers a wide deposition window for the growth of sp² carbon (carbon nanotubes, amorphous carbon), mixtures of sp² and sp³ (diamond-like films), and pure sp³ carbon represented by diamond films. We show that not only the type of plasma source (RF vs. MW) but also the gas mixture and plasma chemistry are crucial parameters for the controllable and reproducible growth of these allotropes at temperatures from 250 to 800 °C.
The properties of SiC and diamond make them attractive materials for MEMS and sensor devices. We developed specific laser ablation techniques to fabricate membranes and cantilevers made of SiC or nano-(micro-)crystalline diamond films grown on Si/SiO2 substrates by microwave chemical vapour deposition (MWCVD). We have also started research into generating surface moulds to grow corrugated diamond films for membranes and cantilevers. A software tool was developed to support the design of micromechanical cantilevers. We can measure the deformation and resonant frequency of diamond cantilevers and identify the global mechanical properties. A benchmark against finite element simulations enables an inverse identification of the specific system parameters and simplifies the characterization procedure.
The properties of diamond make it an attractive material for MEMS and sensor devices. We present the feasibility to fabricate membranes and cantilevers made of nano-(micro-) crystalline diamond films grown on Si/SiO2 substrates using microwave chemical vapour deposition (MWCVD). The patterning of micromechanical structures was performed by a combined process of femtosecond laser ablation and wet etching. We designed cantilever structures with varying lengths and widths (25, 50, 100, 200 and 300 μm). The cantilevers were made in a symmetric left- and right-hand configuration. An additional laser treatment was used to modify the mechanical properties of the left-hand cantilever. The deflection of the laser-treated, and non-treated sections was measured. The global mechanical system properties were simulated and corresponded with high accuracy to the measured results of deflection.
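For the cantilever geometries listed above, the first-mode resonant frequency follows from Euler-Bernoulli beam theory. The material constants and the 1 µm thickness below are typical literature values for CVD diamond, assumed here for illustration; they are not data from the paper.

```python
import math

# First flexural resonance of a rectangular cantilever (Euler-Bernoulli
# clamped-free beam). E, RHO and the thickness are assumed values.
E = 1.0e12          # Young's modulus (Pa), typical for CVD diamond
RHO = 3515.0        # density of diamond (kg/m^3)
BETA1_L = 1.875104  # first-mode eigenvalue of the clamped-free beam

def f_resonance(length, thickness):
    # f1 = (beta1^2 / 2 pi) * (t / L^2) * sqrt(E / (12 rho));
    # note the width cancels for a rectangular cross-section.
    return (BETA1_L**2 / (2 * math.pi)) * (thickness / length**2) \
        * math.sqrt(E / (12 * RHO))

# e.g. a 100 um long cantilever from the design series, assuming 1 um thickness
f1 = f_resonance(100e-6, 1e-6)
```

Since the frequency scales as t/L², doubling the length (100 µm to 200 µm in the design series) drops the first resonance by a factor of four.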
Purpose: Although there is an apparent potential in using data for advanced services in manufacturing environments, SMEs are reluctant to share data with their ecosystem partners, which prevents them from leveraging this potential. Therefore, the purpose of this paper is to analyse the reasons behind these resistances. The argumentation paves the way for elaborating countermeasures that are adequate for the specific situation and the typical capabilities of SMEs.
Design/Methodology/Approach: The analysis is based on literature research and in-depth interviews with management representatives of 15 companies in manufacturing service ecosystems. Half of these are manufacturers and the other half technology or service providers for manufacturers. They are SMEs or partly larger companies operating in structures that are typical for SMEs.
Findings: Data sharing hurdles are investigated in the five dimensions, 1. quantifying the value of data, 2. willingness to share data and trust, 3. organizational culture and mindset, 4. legal aspects, and 5. security and privacy. The ability to quantify the value of data is a necessary but not sufficient precondition for data sharing, which must be enabled by adequate measures in the other four dimensions.
Originality/Value: The findings of this empirical study and the solution approach provide an SME-specific framework to analyze hurdles that must be overcome for sharing data in an ecosystem.
Manufacturing SMEs can apply the framework to overcome the hurdles by specific insights and solution approaches. Furthermore, the analysis illustrates the future research direction of the project towards a comprehensive solution approach for data sharing in a manufacturing ecosystem.
The design and development of smart products and services with data-science-enabled solutions forms a core topic of the current trend of digitalisation in industry. Enabling skilled staff, employees, and students to use data science in their daily routine of designing such products and services is a key concern of higher education institutions such as universities, as well as of company workshop providers and further education. The scope and usage scenario of this paper is to assess software modules ('tools') for integrated data and analytics as a service (DAaaS). The tools are usually driven by machine learning, may be deployed in cloud infrastructures, and are specifically targeted at particular needs of the industrial manufacturing, production, or supply chain sector.
The paper describes existing theories and previous work, namely methods used in didactics, work done on visually designing and using machine learning algorithms (no-code/low-code tools), as well as combinations of these two topics. For tools available on the market, an extended assessment of their suitability for a set of learning scenarios and personas is discussed.
Smart services disrupt business models and have the potential to stimulate the circular economy transition of regions, enabling an environmentally friendly atmosphere for sustainable and innovation-driven regional growth. Although smart services are powerful means for deploying circular economy goals in industrial practice, there is little systematic guidance on how the adoption of smart services could improve resource efficiency and stimulate smart regional innovation-driven growth enabled through circular design. Implemented in the scope of Vorarlberg's smart specialization strategy, this paper contributes to the literature on the circular economy and regional innovation-driven growth by assessing critical factors of value creation and value capture within the quadruple helix system. By identifying the main challenges and opportunities of collaborative value creation and value capture in setting up smart circular economy strategies, and by assessing the role of innovation actors within the quadruple helix innovation system, the study provides recommendations and a set of guidelines for managers and public authorities in managing the circular transition. Finally, based on the analysis of the role of actors in creating shared value and scaling up smart circular economy practices in quadruple helix innovation systems, the paper investigates the role of banks as enablers of circular economy innovation-driven regional growth and smart value creation.
Small and medium-sized enterprises often face resource deficits and therefore depend on cooperating with other actors to stay innovative in a competitive environment. Establishing and maintaining actual co-creation and service interaction strategies, however, is challenging. One reason for this is the complexity of finding methodologies and tools to create valuable outcomes, together with a lack of knowledge of collaboration toolsets, also in virtual environments. This paper introduces an Innovation-Method-Framework consisting of innovation methods for increased service interaction and value co-creation among service stakeholders. Toolsets for the framework's practical application are also provided.
Recent developments in the area of Natural Language Processing (NLP) increasingly allow for the extension of such techniques to hitherto unidentified areas of application. This paper deals with the application of state-of-the-art NLP techniques to the domain of Product Safety Risk Assessment (PSRA). PSRA is concerned with the quantification of the risks a user is exposed to during product use. The use case arises from an important process of maintaining due diligence towards the customers of the company OMICRON electronics GmbH.
The paper proposes an approach to evaluate the consistency of human-made risk assessments that are proposed by potentially changing expert panels. Along the stages of this NLP-based approach, multiple insights into the PSRA process allow for an improved understanding of the related risk distribution within the product portfolio of the company. The findings aim at making the current process more transparent as well as at automating repetitive tasks. The results of this paper can be regarded as a first step to support domain experts in the risk assessment process.
The production of liquid-gas dispersions places high demands on the process technology, requiring knowledge of the bubble formation mechanisms as well as of the phase parameters of the media combinations used. To determine the bubble sizes introduced into a flow without knowing the phase parameters, different process parameters are investigated, and their quality and applicability are evaluated. The results obtained make it possible to simplify the long design processes of dispersion processes in manufacturing plants and to ensure the quality of the manufactured products by reducing waste.
In previous studies of linear rotary systems with active magnetic bearings, parametric excitation was introduced as an open-loop control strategy. The parametric excitation was realized by a periodic, in-phase variation of the bearing stiffness. At the difference between two of the eigenfrequencies of the system, a stabilizing effect, called anti-resonance, was found numerically and validated in experiments. In this work, preliminary results of further exploration of the parametric excitation are shared. A Jeffcott rotor with two active magnetic bearings and a disk is investigated. Using Floquet theory, a deeper insight into the dynamic behavior of the system is obtained. Aiming at a further increase of stability, a phase difference between excitation terms is introduced.
Background: Worldwide, more than 79.5 million people are forcibly displaced, including a significant number of migrant and refugee families with children. Migration and displacement affect these families in different dimensions, such as mental, physical, and spiritual health. Identifying family needs and enhancing parenting skills can improve family cohesion and health, as well as smooth integration into the host country. This review is part of the Erasmus+ funded project IENE 8 (Intercultural Education for Nurses in Europe), which aims at empowering migrant and refugee families regarding parenting skills.
Methods: This was a scoping review of the literature. The IENE 8 partner countries (Cyprus, Germany, Greece, Italy, Romania, and the United Kingdom) searched for peer-reviewed papers, grey literature, and mass media reports at the international, European, and national levels. The search period for the scientific and grey literature was 2013-2018, and for mass media it was 2016-2018. Results: 124 relevant sources were identified, comprising 33 peer-reviewed papers, 47 grey literature documents, and 44 mass media reports. This revealed the importance of understanding the needs of migrant families with children. Conclusion: It is evident from the literature that there is a need to support refugee parents in adjusting their existing skills and to empower them to develop new ones. Healthcare and social services professionals have an essential role in improving refugees' parenting skills. This can be done by developing and implementing family-centered and culturally sensitive intervention programs.
Highly-sensitive single-step sensing of levodopa by swellable microneedle-mounted nanogap sensors
(2022)
Microneedle (MN) sensing of biomarkers in interstitial fluid (ISF) can overcome the challenges of self-diagnosis of diseases by a patient, such as blood sampling, handling, and measurement analysis. However, MN sensing technologies still suffer from poor measurement accuracy due to the small amount of target molecules present in ISF, and require multiple steps of ISF extraction, ISF isolation from the MN, and measurement with additional equipment. Here, we present a swellable MN-mounted nanogap sensor that can be inserted into the skin tissue, absorb ISF rapidly, and measure biomarkers in situ by amplifying the measurement signals via redox cycling in nanogap electrodes. We demonstrate that the MN-nanogap sensor measures levodopa (LDA), a medication for Parkinson's disease, down to 100 nM in aqueous solution, and to 1 μM in both a skin-mimicking gelatin phantom and porcine skin.
This paper presents concepts of optical splitting using three-dimensional (3D) optical splitters based on the multimode interference (MMI) principle. It focuses on the design, fabrication, and characterization of a 3D MMI splitter with formed output waveguides, based on IP-Dip polymer, for direct application on an optical fiber. The MMI optical splitter was simulated and fabricated using a direct laser writing process. The output characteristics were measured with a highly resolved near-field scanning optical microscope (NSOM) and compared with those of a 3D MMI splitter without output waveguides.
We present a 256-channel, 25-GHz AWG designed for ultra-dense wavelength division multiplexing. Two in-house developed tools were used for the design: the AWG-Parameters tool for the calculation of the input design parameters and the AWG-Analyser tool for evaluating the simulated transmission characteristics. The AWG structure was designed for a central wavelength of 1550 nm and simulated with the PHASAR tool from Optiwave. To keep the AWG structure as small as possible, the number of waveguides in the phased array was varied. The simulations show that a certain minimum number of phased-array waveguides is necessary to reach sufficient AWG performance. After optimization, the AWG structure reached a size of 10 cm x 11 cm with satisfying optical properties.
This paper describes two different designs of 1×8 passive optical splitters. The first splitter consists of cascaded directional waveguide branches (Y-branch splitter) with a (0.8×0.16) µm² waveguide cross-section. The second splitter is based on multimode interference occurring in a large MMI coupler, which uses the self-imaging effect for beam propagation and exhibits the same waveguide core size as the Y-branch splitter. The waveguide channel profile used in both approaches is based on a silicon nitride material platform, with a core refractive index of nc = 1.925 and a cladding refractive index of ncl = 1.4575. The splitters are designed as planar structures for the medical operating wavelength of 850 nm. Design, simulation, and optimization of the passive optical components are performed with the BeamPROP simulation engine of the commercial RSoft Photonics Suite, employing the beam propagation method. This work aims to find the minimum physical dimensions of the designed splitters with satisfactory optical performance. According to the minimum insertion loss and minimum non-uniformity, the optimum length of the splitters is determined. Finally, the optical properties of the splitters for both approaches are discussed and compared with each other.
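The two optimization criteria named above, insertion loss and non-uniformity, can be made concrete with hypothetical per-port powers for a 1×8 splitter; the numbers below are invented for illustration, not simulation results from the paper.

```python
import math

# Hypothetical per-port output powers of a 1x8 splitter, as fractions
# of the input power (an ideal lossless split would be 0.125 each).
p_in = 1.0
p_out = [0.115, 0.118, 0.112, 0.110, 0.116, 0.113, 0.111, 0.117]

def db(p, ref):
    # Loss in dB relative to a reference power.
    return -10.0 * math.log10(p / ref)

insertion_loss = [db(p, p_in) for p in p_out]           # per-channel IL (dB)
non_uniformity = max(insertion_loss) - min(insertion_loss)
total_excess = db(sum(p_out), p_in)                      # excess loss (dB)
```

For these numbers the per-channel insertion loss sits near the ideal 9.03 dB of a 1×8 split, with roughly 0.3 dB non-uniformity and 0.4 dB excess loss; the optimization in the paper searches the splitter length that minimizes both criteria simultaneously.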
Calls for decolonising global health have intensified in recent years. The Austrian NGO plan:g Partnership for Global Health has taken several steps to decolonise its work and to find new ways of communicating and engaging in equitable partnerships. Decolonising global health cooperation is however not without its challenges.
Due to the increasing trend of photonic element miniaturisation and the need for optical splitting, we propose and simulate a new type of three-dimensional (3D) optical splitter based on multimode interference (MMI) for a wavelength of 1550 nm. We present designs and parameter studies for the optimized MMI splitter and focus on the possibility of its integration on an optical fiber. The design targets a production process using 3D laser lithography for the prepared experiments. The MMI splitter was fabricated by laser lithography in a direct-writing process and its outputs were finally characterised by near-field measurements.
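The footprint of such an MMI splitter is governed by the standard self-imaging relations: from the beat length L_pi = 4*n_eff*W^2/(3*lambda), a symmetric-interference 1xN splitter forms its N-fold image at L = n_eff*W^2/(N*lambda). A hedged sketch with hypothetical values (effective index and MMI width are illustrative, not taken from the paper):

```python
def mmi_splitter_length_um(n_eff, width_um, wavelength_um, n_outputs):
    """Approximate 1xN MMI section length from symmetric-interference self-imaging:
    L = n_eff * W^2 / (N * lambda), derived from the beat length
    L_pi = 4 * n_eff * W^2 / (3 * lambda)."""
    return n_eff * width_um ** 2 / (n_outputs * wavelength_um)

# Hypothetical polymer waveguide: n_eff = 1.53, MMI width 20 um, 1x4 split at 1.55 um
length_um = mmi_splitter_length_um(1.53, 20.0, 1.55, 4)
```

For these illustrative numbers the MMI section comes out at roughly 99 µm; the simulated optimum would then be refined around such an analytic starting point.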
In this paper, we document optical splitters based on the Y-branch and on the MMI splitting principle. The 1×4 Y-branch splitter was prepared in 3D geometry fully from polymer, approaching single-mode transmission at 1550 nm. We also prepared a new concept of a 1×4 MMI optical splitter. The optical properties and the character of the output optical field of both devices were measured with a near-field scanning optical microscope. The splitting properties and optical outputs of both splitters are very promising and increase the attractiveness of the presented 3D technology and polymers.
The paper deals with the optimization of a 2x2 optical switch for photonic integrated circuits based on two 2x2 MMI splitters and two phase modulators. The optical switch was modelled in RSoftCAD with the simulation tool BeamPROP. The optimization minimises the insertion losses and broadens the spectral band at 1550 nm by using linear tapers in the 2x2 MMI splitter topology. The 2x2 optical switch is a common building block for more complex 1xN or NxN optical switches in all-optical signal processing.
Design, simulation, and optimization of a 1×4 three-dimensional multimode interference optical splitter using IP-Dip polymer as the core and polydimethylsiloxane (PDMS) Sylgard 184 as the cladding is demonstrated. The splitter was simulated using the beam propagation method in the BeamPROP simulation module of the RSoft photonic tool and optimized for an operating wavelength of 1.55 μm. According to the minimum insertion loss, the dimensions of the splitter were optimized for a waveguide with a core size of 4×4 μm2. The objective of the study is to create a design suitable for fabrication by three-dimensional direct laser writing optical lithography.
In this paper we present various educational activities with Photonics Explorer, an educational kit developed by the photonics research team B-PHOT at VUB (Vrije Universiteit Brussel) for students at secondary schools. The concept is a 'lab-in-a-box' that enables students of the 2nd and 3rd grade to do photonics experiments themselves at school with lasers, LEDs, lenses, optical fibers, and other high-tech components. Although the kit was developed for secondary schools, we also use its experiments for other teaching activities, such as lectures at the university, photonics workshops for teachers and children at primary/secondary schools, and events such as the children's/youth's university or the night of sciences. Most of these activities, which are presented here, were organized in the frame of the Austrian-based project Phorsch!
A new software tool, called AWG-Channel-Spacing, is developed to calculate the accurate channel spacing of arrayed waveguide grating (AWG) optical multiplexers/demultiplexers. The tool has been developed with the application framework Qt in the programming language C++. It was evaluated with the design of a 20-channel, 200-GHz AWG. The achieved simulated transmission characteristics prove the correct functionality of the tool.
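The core relation such a tool must evaluate is the conversion between a frequency-grid channel spacing and the corresponding wavelength spacing around the AWG centre, Δλ ≈ λ0²·Δf/c. A minimal sketch of that conversion (the actual AWG-Channel-Spacing implementation is not public, so this is only an illustration of the underlying formula):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def channel_spacing_nm(center_wavelength_nm, spacing_ghz):
    """Wavelength spacing (nm) equivalent to a frequency channel spacing
    around the given centre wavelength: d_lambda = lambda0^2 * d_f / c."""
    lam_m = center_wavelength_nm * 1e-9            # centre wavelength in m
    d_lambda_m = lam_m * lam_m * spacing_ghz * 1e9 / C
    return d_lambda_m * 1e9                        # back to nm

# A 200-GHz grid at 1550 nm corresponds to roughly 1.6 nm channel spacing
dl = channel_spacing_nm(1550.0, 200.0)
```

This matches the common rule of thumb that 100 GHz at 1550 nm is about 0.8 nm.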
A new software tool, called AWG-Wuckler, is developed to calculate the geometric parameters of arrayed waveguide grating structures for telecommunication and medical applications. These parameters are crucial for an AWG layout, which is then created and simulated using commercial photonic design tools. The AWG design process is very complex because the geometric dimensions depend on a large number of input design parameters. Geometric constraints often require an adjustment of the input design parameters and vice versa. Calculating and adjusting the geometric parameters is a time-consuming process that is currently not fully supported by any commercial photonic tool. The AWG-Wuckler tool overcomes this issue and offers a fast and easy-to-use solution. The tool has already been applied in various AWG designs and is technologically well proven.
This paper aims to study the design, simulation, and optimization of low-loss Y-branch passive optical splitters with up to 64 output ports for telecommunication applications. For the waveguide channel profile, the standard silica-on-silicon material is used. The Y-splitters are designed and simulated at the telecommunication operating wavelength λ = 1550 nm. Apart from the lengths of the Y-branches and the waveguide core size, design parameters such as the port pitch between the waveguides and the simulation parameters are kept fixed for all splitters. The simulation results are analyzed to determine the optimum length of the splitters and the optimum core size. Based on this optimization, the total length of the largest designed 1×64 Y-branch splitter was reduced by 41.14 % for a (5×5) μm2 waveguide core compared to the length of the splitter with a standard (6×6) μm2 core size.
Vast amounts of oily wastewater are byproducts of the petrochemical and shipping industries and to this day are frequently discharged into water bodies either without or after insufficient treatment. To alleviate the resulting pollution, water treatment processes are in great demand. Bubble column humidifiers (BCHs) as part of humidification–dehumidification systems are predestined for such a task, since they are insensitive to different feed liquids, simple in design and have low maintenance requirements. While humidification in a bubble column has been investigated extensively for desalination, a systematic investigation of oily wastewater treatment is missing in the literature. We filled this gap by analyzing the treatment of an oil–water emulsion experimentally to derive recommendations for the future design and operation of BCHs. Our humidity measurements indicate that the air stream is always saturated after humidification for a liquid height of only 10 cm. A residual water mass fraction of 3.5 wt% is measured after a batch run of six hours. Furthermore, continuous measurements show that an increase in oil mass fraction leads to a decrease in system productivity, especially at high oil mass fractions. This decrease is caused by the heterogeneity of the liquid temperature profile. A lower liquid height mitigates this heterogeneity, thereby decreasing the heat demand and improving the overall efficiency. The oil content of the produced condensate is below 15 ppm, allowing discharge into various water bodies. The results of our systematic investigation prove the suitability of BCHs for oily wastewater treatment and indicate a strong future potential for their use.
Industrial demand side management has shown significant potential to increase the efficiency of industrial energy systems via flexibility management by model-driven optimization methods. We propose a grey-box model of an industrial food processing plant. The model relies on physical and process knowledge as well as mass and energy balances. The model parameters are estimated using a prediction error method. Optimization methods are applied to separately reduce the total energy consumption, the total energy costs and the peak electricity demand of the plant. A viable potential for demand side management in the plant is identified by increasing the energy efficiency, shifting cooling power to low-price periods or reducing peak load.
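The parameter-estimation step can be illustrated on a toy first-order grey-box model x[k+1] = a·x[k] + b·u[k], where the prediction error method reduces to a closed-form least-squares fit (synthetic data, not the plant model from the paper):

```python
def fit_first_order(x, u):
    """Least-squares fit of a, b in x[k+1] = a*x[k] + b*u[k]
    (one-step prediction error method for a toy linear model)."""
    # Normal equations of the 2-parameter linear regression
    sxx = sum(xi * xi for xi in x[:-1])
    suu = sum(ui * ui for ui in u[:-1])
    sxu = sum(xi * ui for xi, ui in zip(x[:-1], u[:-1]))
    sxy = sum(xi * yi for xi, yi in zip(x[:-1], x[1:]))
    suy = sum(ui * yi for ui, yi in zip(u[:-1], x[1:]))
    det = sxx * suu - sxu * sxu
    a = (sxy * suu - suy * sxu) / det
    b = (suy * sxx - sxy * sxu) / det
    return a, b

# Synthetic noise-free data generated with a = 0.9, b = 0.5, so the fit is exact
a_true, b_true = 0.9, 0.5
u = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0]
x = [0.0]
for uk in u[:-1]:
    x.append(a_true * x[-1] + b_true * uk)
a_est, b_est = fit_first_order(x, u)
```

On real plant data the same principle applies, only with noisy measurements, more states and a numerical optimizer instead of the closed-form solution.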
A trend from centralized to decentralized production is emerging in the manufacturing domain, leading to new and innovative approaches for long-established production methods. A technology supporting this trend is Cloud Manufacturing, which adapts technologies and concepts known from cloud computing to the manufacturing domain. A core aspect of Cloud Manufacturing is representing knowledge about manufacturing, e.g., machine capabilities, in a suitable form. This knowledge representation should be flexible and adaptable so that it fits across various manufacturing domains, but, at the same time, should also be specific and exhaustive. We identify three core capabilities that such a platform has to support: the product, the process and the production. We propose representing this knowledge in semantically specified knowledge graphs, essentially creating three ontologies interconnected through features, each representing a facet of manufacturing. Finally, we present an exemplary implementation of a Cloud Manufacturing platform using this representation and discuss its advantages.
To create a map of an unknown area, autonomous robots must follow an exploration strategy that keeps the time needed to map the whole area low even though the optimal paths are unknown. Multiple robots working together can create the map more efficiently; however, without proper coordination, a team of autonomous robots can take longer to explore an unknown area than a single robot. Coordinated exploration therefore requires a shared infrastructure that extracts, from the information shared by all robots, the information useful to each individual robot. These measures introduce new challenges: the communication infrastructure is loaded, and the overall task of exploring and mapping becomes dependent on correct communication and on the robustness of the shared team infrastructure. Therefore, the amount of communication and each robot's dependency on the rest of the team must be reduced, so that the robots can continue working even if communication with the shared infrastructure fails.
Bubble column humidifiers (BCHs) are frequently used for the humidification of air in various water treatment applications. A potential but not yet profoundly investigated application of such devices is the treatment of oily wastewater. To evaluate this application, the accumulation of an oil-water emulsion using a BCH is experimentally analyzed. The amount of evaporating water vapor can be evaluated by measuring the humidity ratio of the outlet air. However, humidity measurements are difficult in close-to-saturated conditions, as the formation of liquid droplets on the sensor impacts the measurement accuracy. We use a heating section after the humidifier, such that no liquid droplets form on the sensor. This enables a more accurate humidity measurement. Two batch measurement runs are conducted with (1) tap water and (2) an oil-water emulsion as the respective liquid phase. The humidity measurement in high-humidity conditions is highly accurate, with an error margin below 3 %, and can be used to predict the oil concentration of the remaining liquid during operation. The measured humidity ratio corresponds with the removed amount of water vapor for both tap water and the accumulation of an oil-water emulsion. Our measurements show that the residual water content in the oil-water emulsion is below 4 %.
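The humidity ratio measured at the sensor relates to temperature and pressure through standard psychrometric relations; a sketch using the Magnus saturation-pressure approximation (a textbook relation, not the authors' measurement model):

```python
import math

def saturation_pressure_pa(t_celsius):
    """Magnus approximation of the saturation vapour pressure of water (Pa)."""
    return 610.94 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

def humidity_ratio(t_celsius, rel_humidity, p_total_pa=101_325.0):
    """Humidity ratio (kg water vapour per kg dry air) of moist air:
    w = 0.622 * p_v / (p - p_v)."""
    p_v = rel_humidity * saturation_pressure_pa(t_celsius)
    return 0.622 * p_v / (p_total_pa - p_v)

# Saturated air at 30 degC and atmospheric pressure -> about 0.027 kg/kg
w = humidity_ratio(30.0, 1.0)
```

Near saturation a small temperature error shifts p_v strongly, which is exactly why droplet formation on the sensor degrades the measurement and why the heated sensor section helps.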
Grid-scale electrical energy storage (EES) is a key component in cost-effective transition scenarios to renewable energy sources. The requirement of scalability favors EES approaches such as pumped-storage hydroelectricity (PSH) or compressed-air energy storage (CAES), which utilize the cheap and abundant storage materials water and air, respectively. To overcome the site restrictions and low volumetric energy densities attributed to PSH and CAES, liquid-air energy storage (LAES) has been devised; however, it suffers from a rather small round-trip efficiency (RTE) and challenging storage conditions. Aiming to overcome these drawbacks, a novel system for EES is developed using solidified air (i.e., clathrate hydrate of air) as the storable phase of air. A reference plant for solidified-air energy storage (SAES) is conceptualized and modeled thermodynamically using the software CoolProp for water and air as well as empirical data and first-order approximations for the solidified air (SA). The reference plant exhibits an RTE of 52% and a volumetric storage density of 47 kWh per m3 of SA. While this energy density is only half of that of LAES plants, the modeled RTE of SAES is already comparable. Since improved thermal management and the use of thermodynamic promoters can further increase the RTE of SAES, the concept already shows technical potential. Yet, for a successful implementation - in addition to economic aspects - questions regarding the stability of SA must first be clarified and challenges related to the processing of SA resolved.
As the boundary between real and virtual life is becoming increasingly blurred, researchers and practitioners are looking for ways to integrate the two with the intention of improving human lives in a plethora of domains. A cutting-edge concept is the design of Digital Twins (DT), which has a broad range of implications and applications, spanning education, training, and safety and productivity in the workplace. An emergent approach for implementing DTs is the use of mixed reality (MR) and augmented reality (AR), which are well aligned with merging real and virtual objects to enhance the human's ability to interact with and manage DTs. Yet, this is still a novel area of research and, as such, a grounded understanding of the current state, challenges, and open questions is still lacking. Towards this, we conducted a PRISMA-based literature review of scientific articles and book chapters dealing with the use of MR and AR for digital twins. After a thorough screening phase and eligibility check, 25 papers were analyzed, sorted and compared by different categories: research topic (e.g., visualization, guidance), domain (e.g., manufacturing, education), paper type (e.g., design study, evaluation), evaluation type (user study, case study or none), used hardware (e.g., Microsoft HoloLens, mobile devices) as well as the different outcomes (result type and topic, problems, outlook). The major finding of this research survey is the predominant focus of the reviewed papers on the technology itself and the neglect of factors regarding the users. We therefore encourage researchers in this area to keep the importance of ease and joy of use in mind and to include users in multiple stages of their work.
PV hosting capacity provides utilities with knowledge of the maximum amount of solar installations that low voltage grids can accommodate without operational problems. As the quantification of the hosting capacity requires data collection, grid modelling, and often time-consuming simulations, simplified estimations for large-scale applications are of interest. In this paper, Bayesian statistical inference is applied to estimate the hosting capacities of more than 5000 real feeders in Austria. The results show that the hosting capacity of 95% of the feeders can be estimated with a mean error below 20% from knowledge of only a 5% random sample. Moreover, the hosting capacity estimation at a regional level shows a maximum error below 9%, also relying on a random sample of 5% of the total feeders. Furthermore, the proposed approach provides a methodology to assess new parameters aiming to improve the accuracy of the hosting capacity estimation at the feeder level.
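The flavour of inference used here can be illustrated with a conjugate normal update: a prior belief about a regional mean hosting capacity is sharpened by a small random sample of feeder-level values. All numbers below are hypothetical and not taken from the Austrian data set:

```python
def posterior_mean_normal(prior_mean, prior_var, sample_mean, sample_var, n):
    """Posterior mean and variance of a normal mean with known observation
    variance (standard conjugate normal-normal update)."""
    precision = 1.0 / prior_var + n / sample_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + n * sample_mean / sample_var)
    return post_mean, post_var

# Hypothetical: vague prior around 100 kW, then a 5% sample of 250 observed feeders
post_mean, post_var = posterior_mean_normal(
    prior_mean=100.0, prior_var=400.0,    # wide prior uncertainty
    sample_mean=120.0, sample_var=900.0,  # sample statistics of the observed feeders
    n=250,
)
```

With 250 observations the posterior mean is pulled almost entirely towards the sample mean, and the posterior variance collapses far below the prior variance, mirroring how a 5% sample can already pin down regional estimates tightly.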
The impact of global warming and climate change has forced countries to introduce strict policies and decarbonization goals toward sustainable development. To achieve the decarbonization of the economy, a substantial increase of renewable energy sources is required to meet energy demand and to transition away from fossil fuels. However, renewables are sensitive to environmental conditions, which may lead to imbalances between energy supply and demand. Battery energy storage systems are gaining more attention for balancing energy systems in existing grid networks at various levels, such as bulk power management, transmission and distribution, and for end-users. Integrating battery energy storage systems with renewables can also solve reliability issues related to transient energy production and serve as a buffer source for electric vehicle fast charging. Despite these advantages, batteries are still expensive and typically built for a single application - either an energy- or a power-dense application - which limits economic feasibility and flexibility. This paper presents a theoretical approach to a hybrid energy storage system that utilizes both energy- and power-dense batteries serving multiple grid applications. The proposed system will employ second-use electric vehicle batteries in order to maximise the potential of battery waste. The approach is based on a survey of battery modelling techniques and control methods. It was found that equivalent circuit models as well as unified control methods are best suited for modelling hybrid energy storage for grid applications. This approach to hybrid modelling is intended to help accelerate the renewable energy transition by providing reliable energy storage.
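The equivalent-circuit modelling approach the survey points to can be sketched as a first-order Thevenin model: an open-circuit voltage source, a series resistance and one RC pair whose polarisation voltage is integrated over time. All parameter values below are hypothetical placeholders, not fitted battery data:

```python
import math

def simulate_ecm(current_a, dt_s, ocv_v=3.7, r0_ohm=0.01, r1_ohm=0.015, c1_f=2000.0):
    """Terminal voltage of a first-order Thevenin equivalent circuit model.

    current_a: discharge current samples (positive = discharge).
    Returns the terminal voltage after each time step of length dt_s.
    """
    u1 = 0.0                 # polarisation voltage across the RC pair
    tau = r1_ohm * c1_f      # RC time constant
    voltages = []
    for i in current_a:
        # Exact discrete-time update of the RC branch for a constant current over dt
        u1 = u1 * math.exp(-dt_s / tau) + r1_ohm * (1.0 - math.exp(-dt_s / tau)) * i
        voltages.append(ocv_v - r0_ohm * i - u1)
    return voltages

# 10 A constant discharge for 60 steps of 1 s: the terminal voltage sags as the
# RC pair charges towards its steady-state polarisation of r1 * i = 0.15 V
v = simulate_ecm([10.0] * 60, dt_s=1.0)
```

A hybrid energy storage model would combine two such ECMs with different parameter sets (energy-dense vs. power-dense cells) under a common power-split controller.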