Refine
Year of publication
Document Type
- Conference Proceeding (308)
- Article (288)
- Master's Thesis (113)
- Part of a Book (53)
- Book (19)
- Doctoral Thesis (9)
- Report (6)
- Preprint (5)
- Working Paper (4)
- Other (3)
Institute
- Forschungszentrum Mikrotechnik (247)
- Forschungszentrum Business Informatics (149)
- Technik | Engineering & Technology (125)
- Department of Computer Science (dissolved at the end of 2021; integrated into the parent organizational unit Technik) (112)
- Wirtschaft (106)
- Forschungszentrum Energie (79)
- Didaktik (dissolved as of 31 March 2021; integrated into the TELL Center) (37)
- Forschungszentrum Human Centred Technologies (35)
- Soziales & Gesundheit (34)
- Josef Ressel Zentrum für Materialbearbeitung (27)
Language
- English (815)
Keywords
- Laser ablation (11)
- Y-branch splitter (11)
- arrayed waveguide gratings (11)
- photonics (8)
- Evolution strategy (7)
- Demand side management (6)
- Optimization (6)
- integrated optics (6)
- AWG (5)
- Arrayed waveguide gratings (5)
The utilization of lasers in dentistry has expanded greatly in recent years. For instance, fs-lasers are effective for both drilling and caries prevention, while cw-lasers are useful for adhesive hardening. A cutting-edge application of lasers in dentistry is the debonding of veneers. While tools for this purpose already exist, there is still potential for improvement. Initial efforts to investigate laser-assisted debonding mechanisms through measurements of the optical and mechanical properties of teeth and prosthetic ceramics are presented. Preliminary tests conducted with a commercially available laser system used for debonding showed differences between the output power set at the system's console and that measured at specified distances from the handpiece. Furthermore, the optical properties of the samples (human teeth and ceramics) were characterised. The optical properties of the ceramics should closely resemble those of teeth in terms of look and feel, but they also influence the laser-assisted debonding technique and thus must be taken into account. In addition, first attempts were made to investigate the mechanical properties of the samples by means of pump-probe elastography under a microscope. By analyzing the sample surface up to 20 ns after a fs-laser pulse impact, pressure and shock waves could be detected, which can be utilized to determine the elastic constants of specific materials. Together, such investigations form the basis for a purely optical approach to the debonding of veneers utilizing acoustic waves.
This paper presents a project developed at the K.S.Rangasamy College of Technology (Tamil Nadu, India) aimed at designing, implementing, and testing an autonomous multipurpose vehicle with safe, efficient, and economic operation. This autonomous vehicle moves through the crop lines of agricultural land and performs tasks that are tedious and/or hazardous to farmers. First, it has been equipped for spraying, but other configurations have also been designed, such as a seeding unit, a platform to reach the top part of the plants to perform different tasks (pruning, harvesting, etc.), and a trailer to transport the fruits, plants, and crop waste.
Modern portable electronic devices have seen component heat loads increase, while the space available for heat dissipation has decreased. This requires the thermal management system to be optimized to attain a high-performance heat sink. Heat sinks play a major role in dissipating heat in electronic devices, and phase change material (PCM) can be used to enhance heat dissipation in a heat sink. This paper reports the results of an experimental investigation of the performance of pin-fin heat sinks filled with phase change materials for the thermal management of electronic devices. The experimental setups are instrumented using the graphical programming language LabVIEW (Laboratory Virtual Instrument Engineering Workbench). Three different types of pin-fin heat sink, with and without PCM, are investigated over different operating periods, with temperatures acquired via a data acquisition card (DAQ). The results indicate that the inclusion of the PCM can stabilize the temperature for a longer period and reduce the heating rate and peak temperature of the heat sink, and that increasing the number of fins can further enhance the thermal performance of electronic devices.
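The stabilizing effect of a PCM described above can be illustrated with a lumped-capacitance sketch; all parameter values below are invented for illustration and are not taken from the experiment.

```python
import numpy as np

# Lumped-capacitance sketch of a heat sink under a constant load, with and
# without a PCM melting near 45 degC.  All parameter values are hypothetical.
q_in = 8.0        # heat load [W]
h_loss = 0.15     # convective loss coefficient [W/K]
t_amb = 25.0      # ambient temperature [degC]
c_sink = 50.0     # sensible heat capacity of the sink [J/K]
latent = 4000.0   # total latent heat available in the PCM [J]
dt, steps = 1.0, 1200   # 1 s time steps, 20 min of heating

def simulate(with_pcm):
    t, melted = t_amb, 0.0
    trace = []
    for _ in range(steps):
        q_net = q_in - h_loss * (t - t_amb)
        if with_pcm and t >= 45.0 and melted < latent and q_net > 0:
            melted += q_net * dt      # net heat goes into melting the PCM
        else:
            t += q_net * dt / c_sink  # sensible heating of the sink
        trace.append(t)
    return np.array(trace)

peak_no_pcm = simulate(False).max()
peak_pcm = simulate(True).max()
```

In this toy model the PCM holds the sink at its melting plateau until the latent heat is exhausted, reproducing qualitatively the reduced peak temperature and heating rate reported above.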
Signatures of the optical Stark effect on entangled photon pairs from resonantly pumped quantum dots
(2023)
Two-photon resonant excitation of the biexciton-exciton cascade in a quantum dot generates highly polarization-entangled photon pairs in a near-deterministic way. However, the ultimate level of achievable entanglement is still debated. Here, we observe the impact of the laser-induced ac-Stark effect on the quantum dot emission spectra and on entanglement. For increasing pulse-duration-to-lifetime ratios and pump powers, decreasing values of concurrence are recorded. Nonetheless, additional contributions are still required to fully account for the observed below-unity concurrence.
Strain-induced dynamic control over the population of quantum emitters in two-dimensional materials
(2023)
The discovery of quantum emitters in two-dimensional materials has triggered a surge of research to assess their suitability for quantum photonics. While their microscopic origin is still the subject of intense study, ordered arrays of quantum emitters are routinely fabricated using static strain gradients, which drive excitons toward localized regions of the 2D crystals where quantum-light emission takes place. However, the possibility of using strain in a dynamic fashion to control the appearance of individual quantum emitters had not been explored so far. In this work, we tackle this challenge by introducing a novel hybrid semiconductor-piezoelectric device in which WSe2 monolayers are integrated onto piezoelectric pillars delivering both static and dynamic strains. Static strains are first used to induce the formation of quantum emitters, whose emission shows photon anti-bunching. Their excitonic population and emission energy are then reversibly controlled via the application of a voltage to the piezoelectric pillar. Numerical simulations combined with drift-diffusion equations show that these effects are due to a strain-induced modification of the confining-potential landscape, which in turn leads to a net redistribution of excitons among the different quantum emitters. Our work provides relevant insights into the role of strain in the formation of quantum emitters in 2D materials and suggests a method to switch them on and off on demand.
Synthetic polymers such as polyamide (PA) inherently possess a modest number of surface functionalities compared to natural polymers, which negatively impacts the uniformity of metallic coatings obtained through wet-chemical methods like electroless plating. This paper presents the use of a siloxane interlayer, formed by condensation of the hydrolyzed 3-triethoxysilylpropyl succinic anhydride (TESPSA) precursor, as a strategy to modify the surface properties of polyamide 6.6 (PA66) fabrics and improve the uniformity of the copper surface coating. The application of the siloxane intermediate coating yields a significant improvement in electrical conductivity, up to 20 times higher than in fabrics without the interlayer. The morphology of the coatings was investigated using scanning electron microscopy (SEM) and laser confocal scanning microscopy (LSM). In addition, dye adsorption, flexural rigidity, air permeability and contact angle measurements were conducted to monitor the change in the PA66 properties after the siloxane functionalization.
Experimental multi-state quantum discrimination in the frequency domain with quantum dot light
(2022)
The quest for the realization of effective quantum state discrimination strategies is of great interest for quantum information technology as well as for fundamental studies. It is therefore crucial to develop new and more efficient methods to implement discrimination protocols for quantum states. Among these, single-photon implementations are preferable because of their inherent security advantage in quantum communication scenarios. In this work, we present the experimental realization of a protocol employing a time-multiplexing strategy to optimally discriminate among eight non-orthogonal states, encoded in the four-dimensional Hilbert space spanning both the polarization degree of freedom and photon energy. The experiment, built on a custom-designed bulk-optics analyser setup and single photons generated by a nearly deterministic solid-state source, represents a benchmarking example of minimum-error discrimination with actual quantum states, requiring only linear optics and two photodetectors. Our work paves the way for more complex applications and delivers a novel approach towards high-dimensional quantum encoding and decoding operations.
A quantum-light source that delivers photons with a high brightness and a high degree of entanglement is fundamental for the development of efficient entanglement-based quantum-key distribution systems. Among all possible candidates, epitaxial quantum dots are currently emerging as one of the brightest sources of highly entangled photons. However, the optimization of both brightness and entanglement currently requires different technologies that are difficult to combine in a scalable manner. In this work, we overcome this challenge by developing a novel device consisting of a quantum dot embedded in a circular Bragg resonator, in turn, integrated onto a micromachined piezoelectric actuator. The resonator engineers the light-matter interaction to empower extraction efficiencies up to 0.69(4). Simultaneously, the actuator manipulates strain fields that tune the quantum dot for the generation of entangled photons with fidelities up to 0.96(1). This hybrid technology has the potential to overcome the limitations of the key rates that plague current approaches to entanglement-based quantum key distribution and entanglement-based quantum networks.
Beyond the Four-Level Model: Dark and Hot States in Quantum Dots Degrade Photonic Entanglement
(2023)
Entangled photon pairs are essential for a multitude of quantum photonic applications. To date, the best performing solid-state quantum emitters of entangled photons are semiconductor quantum dots operated around liquid-helium temperatures. To favor the widespread deployment of these sources, it is important to explore and understand their behavior at temperatures accessible with compact Stirling coolers. Here we study the polarization entanglement among photon pairs from the biexciton–exciton cascade in GaAs quantum dots at temperatures up to ∼65 K. We observe entanglement degradation accompanied by changes in decay dynamics, which we ascribe to thermal population and depopulation of hot and dark states in addition to the four levels relevant for photon pair generation. Detailed calculations considering the presence and characteristics of the additional states and phonon-assisted transitions support the interpretation. We expect these results to guide the optimization of quantum dots as sources of highly entangled photons at elevated temperatures.
In this paper, we consider the question of data aggregation using the practical example of emissions data for economic activities for the sustainability assessment of regional bank clients. Given the current scarcity of company-specific emission data, an approximation relies on using available public data. These data are reported in different standards in different sources. To determine a mapping between the different standards, an adaptation of the Covariance Matrix Self-Adaptation Evolution Strategy is proposed. The obtained results show that high-quality mappings are found. Moreover, our approach is transferable to other data compatibility problems, such as merging emissions data for other countries or bridging the gap between entirely different data sets.
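The mapping-recovery idea can be sketched with a toy evolution strategy; the 4-to-3-category example, the population settings, and the simple step-size decay below are illustrative stand-ins for the paper's CMSA-ES adaptation, not its actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance: emissions reported in a 4-category "source" standard must be
# mapped onto a 3-category "target" standard.  The mapping matrix and data
# sizes are invented for illustration.
true_M = np.array([[1.0, 0.0, 0.0],
                   [0.0, 0.7, 0.3],
                   [0.0, 1.0, 0.0],
                   [0.5, 0.0, 0.5]])
X = rng.random((50, 4))   # 50 paired emission reports, source standard
Y = X @ true_M            # the same reports in the target standard

def loss(m_flat):
    """Mean squared mismatch between mapped and reported target values."""
    return np.mean((X @ m_flat.reshape(4, 3) - Y) ** 2)

# Minimal (mu/mu, lambda) evolution strategy with intermediate recombination
# and a geometric step-size decay -- a drastically simplified stand-in for
# the CMSA-ES adaptation used in the paper.
mean, sigma = np.full(12, 0.5), 0.3
for _ in range(400):
    pop = mean + sigma * rng.standard_normal((40, 12))
    elite = pop[np.argsort([loss(p) for p in pop])[:10]]
    mean = elite.mean(axis=0)   # recombine the 10 best of 40 offspring
    sigma *= 0.99               # shrink the mutation step size

M_hat = mean.reshape(4, 3)      # recovered mapping between the two standards
```

A full CMSA-ES would additionally self-adapt the step size and the covariance matrix of the mutation distribution, but the selection-recombination loop has the same shape.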
This study presents different approaches to increase the sensing area of NiO based semiconducting metal oxide gas sensors. Micro- and nanopatterned laser induced periodic surface structures (LIPSS) are generated on silicon and Si/SiO2 substrates. The surface morphologies of the fabricated samples are examined by FE SEM. We select the silicon samples with an intermediate Si3N4 layer due to its superior isolation quality over the thermal oxide for evaluating the hydrogen and acetone sensitivity of a NiO based test sensor.
Objectives: The MetabQoL 1.0 is the first disease-specific health-related quality of life (HrQoL) questionnaire for patients with intoxication-type inherited metabolic disorders. Our aim was to assess the validity and reliability of the MetabQoL 1.0 and to investigate the neuropsychiatric burden in our patient population. Methods: Data from 29 patients followed at a single center, aged between 8 and 18 years with a diagnosis of methylmalonic acidemia (MMA), propionic acidemia (PA) or isovaleric acidemia (IVA), and their parents were included. The Pediatric Quality of Life Inventory (PedsQL) was used to evaluate the validity and reliability of the MetabQoL 1.0.
Results: The MetabQoL 1.0 was shown to be valid and reliable (Cronbach's alpha: 0.64–0.9). Fourteen out of the 22 patients (63.6%) formally evaluated had neurological findings. Of note, 17 out of 20 patients (85%) had a psychiatric disorder when evaluated formally by a child and adolescent psychiatrist. The median mental scores of the MetabQoL 1.0 proxy report were significantly higher than those of the self report (p = 0.023). Patients with neonatal-onset disease had higher MetabQoL 1.0 proxy physical (p = 0.008), mental (p = 0.042), total scores (p = 0.022); and self report social (p = 0.007) and total scores (p = 0.043) than those with later onset disease.
Conclusions: This study provides further evidence that the MetabQoL 1.0 is an effective tool to measure what matters in intoxication-type inherited metabolic disorders. Our results highlight the importance of clinical assessment complemented by patient-reported outcomes, which further expands the evaluation toolbox for inherited metabolic diseases.
Whether at the molecular or the cellular scale, cell-cell adhesions in organisms adapt to external mechanical cues arising from the static environment of cells and from dynamic interactions between neighboring cells. Cell-cell adhesions need to resist detachment forces to secure the integrity and internal organization of organisms. In the past, various techniques have been developed to characterize the adhesion properties of molecules and cells in vitro and to understand how cells sense and probe their environment. Atomic force microscopy and dual-pipette aspiration, where cells are mainly present in suspension, are common methods for studying the detachment forces of cell-cell adhesions. How cell-cell adhesion forces develop in adherent, environment-adapted cells, however, is less clear. Here, we designed the Cell-Cell Separation Device (CC-SD), a microstructured substrate that measures both intercellular forces and external stresses of cells towards the matrix. The design is based on micropillar arrays originally developed for cell traction-force measurements. We designed PDMS micropillar blocks to which cells could adhere and connect to each other across the gap. Controlled stretching of the whole substrate changed the distance between blocks and increased the gap size. This allowed us to apply strains to cell-cell contacts, eventually leading to cell-cell adhesion detachment, which was measured by pillar deflections. The CC-SD provided an increase of the gap between the blocks of up to 2.4-fold, which was sufficient to separate substrate-attached cells with a fully developed F-actin network. Simultaneously measured pillar deflections allowed us to assess the cellular response to the applied intercellular strain. The CC-SD thus opens up possibilities for the analysis of cell-cell detachment forces and sheds light on the robustness of cell-cell adhesions in dynamic processes during tissue development.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R^2 = 0.95 and a mean absolute error of MAE = 1.2 t/h at an average steam production of 35.1 t/h is reached. On average, using a dynamic dataset for training lowers the MAE by 32% compared to a static training dataset. A feature set based on expert knowledge lowers the MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When using a dynamic dataset, the performance parameters of thermal power plants are predicted with high accuracy, allowing short-term problems to be detected.
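The two reported error measures can be reproduced on hypothetical data as follows; the steam-production numbers below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical hourly steam production [t/h]: measured vs. model prediction.
y_true = np.array([34.2, 35.8, 36.1, 33.9, 35.5, 34.8, 36.4, 33.7])
y_pred = np.array([33.8, 36.1, 35.6, 34.4, 35.2, 35.3, 35.9, 34.1])

mae = np.mean(np.abs(y_true - y_pred))          # mean absolute error [t/h]
ss_res = np.sum((y_true - y_pred) ** 2)         # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
r2 = 1.0 - ss_res / ss_tot                      # coefficient of determination
```

MAE keeps the units of the target (t/h here), while R^2 is dimensionless, which is why the paper reports both.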
Highly-sensitive single-step sensing of levodopa by swellable microneedle-mounted nanogap sensors
(2023)
Microneedle (MN) sensing of biomarkers in interstitial fluid (ISF) can overcome the challenges of self-diagnosis of diseases by a patient, such as blood sampling, handling, and measurement analysis. However, MN sensing technologies still suffer from poor measurement accuracy due to the small amount of target molecules present in ISF, and require multiple steps of ISF extraction, ISF isolation from the MN, and measurement with additional equipment. Here, we present a swellable MN-mounted nanogap sensor that can be inserted into the skin tissue, absorb ISF rapidly, and measure biomarkers in situ by amplifying the measurement signals through redox cycling in nanogap electrodes. We demonstrate that the MN-nanogap sensor measures levodopa (LDA), a medication for Parkinson's disease, down to 100 nM in aqueous solution and 1 μM in both a skin-mimicking gelatin phantom and porcine skin.
Organic acidurias (OAs), urea-cycle disorders (UCDs), and maple syrup urine disease (MSUD) belong to the category of intoxication-type inborn errors of metabolism (IT-IEM). Liver transplantation (LTx) is increasingly utilized in IT-IEM. However, studies of its impact have mainly focused on clinical outcome measures and rarely on health-related quality of life (HrQoL). The aim of the study was to investigate the impact of LTx on HrQoL in IT-IEM. This single-center prospective study involved 32 patients (15 OA, 11 UCD, 6 MSUD; median age at LTx 3.0 years, range 0.8–26.0). HrQoL was assessed pre/post transplantation by the PedsQL General Module 4.0 and by the MetabQoL 1.0, a tool specifically designed for IT-IEM. PedsQL highlighted significant post-LTx improvements in total and physical functioning in both patients' and parents' scores. According to age at transplantation (≤3 vs. >3 years), younger patients showed higher post-LTx scores on Physical (p = 0.03), Social (p < 0.001), and Total (p = 0.007) functioning. MetabQoL confirmed significant post-LTx changes in Total and Physical functioning in both patients' and parents' scores (p ≤ 0.009). Unlike PedsQL, MetabQoL Mental (patients p = 0.013, parents p = 0.03) and Social scores (patients p = 0.02, parents p = 0.012) were significantly higher post-LTx. Significant improvements (p = 0.001–0.04) were also detected in both self- and proxy-reports for almost all MetabQoL subscales. This study shows the importance of assessing the impact of transplantation on HrQoL, a meaningful outcome reflecting patients' wellbeing. LTx is associated with significant improvements of HrQoL in both self- and parent-reports. The comparison between PedsQL-GM and MetabQoL highlighted that MetabQoL demonstrated higher sensitivity in the assessment of disease-specific domains than the generic PedsQL tool.
Long-term outcome of infantile-onset Pompe disease patients treated with enzyme replacement therapy
(2024)
Background: Enzyme replacement therapy (ERT) with recombinant human alglucosidase alfa (rhGAA) was approved in Europe in 2006. Nevertheless, data on the long-term outcome of infantile onset Pompe disease (IOPD) patients at school age is still limited.
Objective: We analyzed in detail cardiac, respiratory, motor, and cognitive function of 15 German-speaking patients aged 7 and older who started ERT at a median age of 5 months.
Results: Starting dose was 20 mg/kg biweekly in 12 patients, 20 mg/kg weekly in 2, and 40 mg/kg weekly in one patient. CRIM-status was positive in 13 patients (86.7%) and negative or unknown in one patient each (6.7%). Three patients (20%) received immunomodulation. Median age at last assessment was 9.1 (7.0–19.5) years. At last follow-up 1 patient (6.7%) had mild cardiac hypertrophy, 6 (42.9%) had cardiac arrhythmias, and 7 (46.7%) required assisted ventilation. Seven patients (46.7%) achieved the ability to walk independently and 5 (33.3%) were still ambulatory at last follow-up. Six patients (40%) were able to sit without support, while the remaining 4 (26.7%) were tetraplegic. Eleven patients underwent cognitive testing (Culture Fair Intelligence Test), while 4 were unable to meet the requirements for cognitive testing. Intelligence quotients (IQs) ranged from normal (IQ 117, 102, 96, 94) in 4 patients (36.4%) to mild developmental delay (IQ 81) in one patient (9.1%) to intellectual disability (IQ 69, 63, 61, 3x < 55) in 6 patients (54.5%). White matter abnormalities were present in 10 out of 12 cerebral MRIs from 7 patients.
Measuring what matters
(2023)
Patient-reported outcomes (PROs) are generally defined as 'any report of the status of a patient's health condition that comes directly from the patient, without interpretation of the patient's response by a clinician or anyone else'. A broader definition of PRO also includes 'any information on the outcomes of health care obtained directly from patients without modification by clinicians or other health care professionals'. Following this approach, PROs encompass subjective perceptions of patients on how they function or feel, not only in relation to a health condition but also to its treatment, as well as concepts such as health-related quality of life (HrQoL), information on the functional status of a patient, signs and symptoms, and symptom burden. PRO measurement instruments (PROMs) are mostly questionnaires and inform about what patients can do and how they feel. PROs and PROMs have not yet found unconditional acceptance and wide use in the field of inborn errors of metabolism. This review summarises the importance and usefulness of PROs in research, drug legislation and clinical care and informs about quality standards, development, and potential methodological shortfalls of PROMs. Inclusion of PROs measured with high-quality, well-selected PROMs into clinical care, drug legislation, and research helps to identify unmet needs, improve quality of care, and define outcomes that are meaningful to patients. The field of IEM should open up to new methodological approaches, such as the definition of core sets of variables including PROs to be systematically assessed in specific metabolic conditions, and to new collaborations with PRO experts, such as psychologists, to facilitate the systematic collection of meaningful data.
Why do some countries assign a major role to wind energy in decarbonizing their electricity systems, while others are much less committed to this technology? We argue that processes of (de-)legitimation, driven by discourse coalitions who strategically employ certain storylines in public debates, provide part of the answer. To illustrate our approach, we comparatively investigate public discourses surrounding wind energy in Austria and Switzerland, two countries that differ strongly in wind energy deployment. By combining a qualitative content analysis and a discourse network analysis of 808 newspaper articles published 2010–2020, we identify four distinct sets of storylines used to either delegitimize or legitimize the technology. Our study indicates that low deployment rates in Switzerland can be related to the prominence of delegitimizing storylines in the public discourse, which result in a rather low socio-political acceptance of wind energy. In Austria, by contrast, there is more consistent support for wind energy by discourse coalitions using a broad set of legitimizing storylines. By bridging the related but separate literatures of technology legitimacy and social acceptance, our study contributes to a better understanding of socio-political conflict and divergence in low-carbon technological pathways.
A step change is needed in the deployment of renewable energy if the triple challenge of ensuring climate change mitigation, energy security, and energy affordability is to be met. Yet, social acceptance of infrastructure projects and policies remains a key concern. While there has been decades of fruitful research on the social acceptance of wind energy and other renewables, much of the extant research is cross-sectional in nature, failing to capture the important dynamic processes that can make or break renewable energy projects. This paper introduces a Special Issue of Energy Policy which focuses on the neglected topic of the dynamics of social acceptance of renewable energy, drawing on contributions made at an international research conference held in St. Gallen (Switzerland) in June 2022. In addition to introducing these papers and drawing out common themes, we also seek to offer some conceptual clarity on the issue of dynamics in social acceptance, taking into account the influence of time, power, and scale in shaping decision-making processes. We conclude by highlighting a number of avenues of potential future research.
X-ray microtomography is a nondestructive, three-dimensional inspection technique applied across a vast range of fields and disciplines, from research to industrial applications, encompassing engineering, biology, and medical research. Phase-contrast imaging extends the domain of application of x-ray microtomography to classes of samples that exhibit weak attenuation and thus appear with poor contrast in standard x-ray imaging. Notable examples are low-atomic-number materials, like carbon-fiber composites, soft matter, and biological soft tissues. We report on a compact and cost-effective system for x-ray phase-contrast microtomography. The system features high sensitivity to phase gradients and high resolution, requires a low-power sealed x-ray tube and a single optical element, and fits in a small footprint. It is compatible with standard x-ray detector technologies: in our experiments, we observed that single-photon counting offered higher angular sensitivity, whereas flat panels provided a larger field of view. The system is benchmarked against known-material phantoms, and its potential for soft-tissue three-dimensional imaging is demonstrated on small-animal organs: a piglet esophagus and a rat heart. We believe that the simplicity of the setup we are proposing, combined with its robustness and sensitivity, will facilitate access to quantitative x-ray phase-contrast microtomography as a research tool across disciplines, including tissue engineering, materials science, and nondestructive testing in general.
Parametric anti-resonance is a phenomenon that occurs in systems with at least two degrees of freedom and can be achieved by periodically exciting some parameters of the system. The effect of this properly tuned periodicity is to increase the dissipation in the system, which raises the effective damping of vibrations. This contribution presents the design of an open-loop control to reduce the settling time using the anti-resonance concept. The control signal consists of a quasi-periodic signal capable of transferring the system's oscillations from one mode to another. The general averaging technique is used to characterize the dynamics, particularly the so-called slow dynamics of the motion. With this analysis, the control signal is designed for the potential application of a microelectromechanical sensor arrangement; for this specific example, a reduction of the settling time of up to 96.8% is achieved.
In this work, parametric excitation is introduced in a fully balanced flexible rotor mounted on two identical active gas foil bearings. The active gas foil bearings change the top foil shape harmonically with a specific amplitude and frequency. The deformable foil shape is approximated by an analytical function, while the gas pressure distribution is evaluated by the numerical solution of the Reynolds equation for compressible flow. The harmonic variation of the foil shape generates a corresponding variation in the bearings' stiffness and damping properties, and the system experiences parametric resonances and anti-resonances at specific excitation frequencies. The nonlinear gas bearing forces generate bifurcations in the solutions of the system at certain rotating speeds and excitation frequencies; period-doubling and Neimark-Sacker bifurcations are noticed in the examined system, and their progress is evaluated as the two bifurcation parameters (rotating speed and parametric excitation frequency) are changed, through a codimension-2 numerical continuation of limit cycles. It is found that in a specific range of excitation frequencies there are parametric anti-resonances at which the bifurcations collide and vanish. A bifurcation-free operating range is thereby established, and the system can operate stably over a wide speed range.
Digitalization is changing business models and operational processes. At the same time, improved data availability and powerful analytical methods are influencing controlling and increasingly require the use of statistical and information technology skills and knowledge. Using a case study from marketing controlling, the article shows the use of business analytics methods and addresses the tasks of controlling in the digital age.
By a simple femtosecond laser process, we fabricated metal-oxide/gold composite films for electrical and optical gas sensors. We designed a dripple wavelength AWG spectrometer matched to the plasma absorption wavelength region of the composite films. The H2/CO absorptions fit well with the AWG design for multi-gas detection sensor arrays.
Pooled data from published reports on infants with clinically diagnosed vitamin B12 (B12) deficiency were analyzed with the purpose of describing the presentation, diagnostic approaches, and risk factors of the condition in order to inform prevention strategies. An electronic (PubMed database) and manual literature search following the PRISMA approach was conducted (preregistered with the Open Science Framework, accessed on 15 February 2023). Data from 102 publications (292 cases) were described and analyzed using correlation analyses, Chi-square tests, ANOVAs, and regression analyses. The mean age at first symptoms (anemia, various neurological symptoms) was four months; the mean time to diagnosis was 2.6 months. Maternal B12 at diagnosis, exclusive breastfeeding, and a maternal diet low in B12 predicted infant B12, methylmalonic acid, and total homocysteine. Infant B12 deficiency is still not easily diagnosed. Methylmalonic acid and total homocysteine are useful diagnostic parameters in addition to B12 levels. Since maternal B12 status predicts infant B12 status, it would probably be advantageous to target women early in pregnancy or even preconceptionally to prevent infant B12 deficiency, rather than to rely on newborn screening, which often does not reliably identify high-risk children.
Grey box models provide an important approach for control analysis in the Heating, Ventilation and Air Conditioning (HVAC) sector. Grey box models consist of physical models whose parameters are estimated from data. Given the vast number of component models found in the literature, the question arises which component models perform best on a given system or dataset. This question is investigated systematically using a test case system with real operational data. The test case system consists of an HVAC system containing an energy recovery unit (ER), a heating coil (HC) and a cooling coil (CC). For each component, several suitable model variants from the literature are adapted appropriately and implemented: four model variants for the ER and five model variants each for the HC and CC. Further, three global and four local optimization algorithms for solving the nonlinear least-squares system identification are implemented, leading to a total of 700 combinations. The comparison of all variants shows that the global optimization algorithms do not provide significantly better solutions, while their runtimes are significantly higher. Analysis of the models shows a dependency of the model accuracy on the total number of parameters.
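The grey-box workflow described above, estimating physical parameters by nonlinear least squares against operational data, can be sketched as follows. This is a minimal illustration, not one of the study's actual component models: the one-parameter heating-coil effectiveness law, the parameter `k`, and the synthetic data are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical static grey-box model of a heating coil: the air-side
# temperature rise is an effectiveness (saturating in water mass flow,
# with unknown parameter k) times the driving temperature difference.
def coil_model(k, m_dot, t_water_in, t_air_in):
    eps = 1.0 - np.exp(-k * m_dot)        # assumed saturation law
    return eps * (t_water_in - t_air_in)  # air-side temperature rise

rng = np.random.default_rng(0)
m_dot = rng.uniform(0.1, 2.0, 200)  # kg/s, synthetic operating points
t_w, t_a = 60.0, 20.0               # supply water / inlet air temperature (C)
k_true = 1.3
dt_meas = coil_model(k_true, m_dot, t_w, t_a) + rng.normal(0, 0.2, 200)

# Nonlinear least-squares identification with a local solver
res = least_squares(lambda k: coil_model(k[0], m_dot, t_w, t_a) - dt_meas,
                    x0=[0.5], bounds=(0.0, 10.0))
print(round(res.x[0], 2))  # estimate close to k_true
```

In the study's setting, the same residual-minimization step would be repeated for each component model variant and each optimizer, and the resulting fits compared by accuracy and runtime.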
Purpose – The purpose of this study is to explore the exogenous and endogenous drivers of the high growth of Unicorn start-ups along their life cycle, with a particular focus on Unicorns in the fintech industry.
Design/methodology/approach – The study employs an explorative longitudinal analysis with a matched pair of two Unicorn start-ups with similar antecedent features to understand drivers holistically over the longer term.
Findings – High-growth patterns over the longer term are the result of a combined industry- and company-life-cycle perspective. Drivers and growth patterns vary significantly according to the time of entry into the industry and its development status. The findings are systematised within a set of propositions to be tested in future research.
Research limitations/implications – The limitations lie in the empirical evidence, as the analysis is limited to one matched pair. The revealed drivers of Unicorns' long-term growth might encourage future research to investigate these drivers on a larger scale.
Practical implications – The study offers practical recommendations for start-ups with high-growth ambitions and advice to policy makers regarding the development of tailor-made support programs.
Originality/value – The study significantly extends extant work on growth and high-growth by examining endogenous and exogenous triggers over time and by linking the Unicorn-life cycle to the industry life cycle, an approach which has, to the best of the authors’ knowledge, not yet been applied.
International Entrepreneurship explains the opportunities and challenges facing internationalizing entrepreneurial ventures. The book includes a thorough discussion of fundamentals as well as contemporary research findings. Numerous cases, featuring diverse contexts, illustrate theory and support classroom use.
The main aims of this work are the validation of the developed process of gluing a single-mode optical fiber array to a photonic chip and the selection of the more suitable adhesive from the two adhesives being compared. An active alignment system was used for aligning the two optical fiber arrays to a photonic chip. The gluing was done with the two compared UV-curable adhesives applied in the optical path. The insertion losses of the glued coupling were measured and investigated at two discrete wavelengths, 1310 nm and 1550 nm, during temperature testing in the climatic chamber according to Telcordia GR-1209-CORE, Issue 4 [3]. The measurement, investigation, and comparison of insertion losses of the glued coupling over the spectral band from 1530 nm to 1570 nm were done immediately after the gluing process and again, after a one-month delay, following three temperature cycles in the climatic chamber.
In 2021, a prominent Austrian dairy producer suffered an IT attack and was completely paralysed. Without clearly defined mitigation measures in place, major disruptions were caused along the whole supply chain, including logistics service providers, governmental food safety bodies, as well as retailers (i.e., supermarkets and convenience stores). In this paper, we ask how digitisation and digital transformation impact IT security, especially when considering the complex company ecosystems of food production and food supply chains in Austria. The problem statement stems from a gap in knowledge of key differences in approaches towards IT security, resilience, risk management and especially business interfaces between food suppliers, supermarkets, distributors, logistics and other service providers. To answer the related research questions, the authors first conduct literature research, highlight common guidelines and standardisation, and look at state-based recommendations for critical infrastructure. In a second step, the paper describes in detail a quantitative and qualitative survey of Austrian food companies (producers and retailers). A description of recommended measures for the industry, further steps, as well as an outlook conclude the paper.
Background: Cardiovascular disease is the major cause of death worldwide. Although knowledge regarding diagnosing and treating cardiovascular disease has increased dramatically, secondary prevention remains insufficiently implemented due to failure among affected individuals to adhere to guideline recommendations. This has continued to lead to high morbidity and mortality rates. Involving patients in their healthcare and facilitating their active roles in their chronic disease management is an opportunity to meet the needs of the increasing number of cardiovascular patients. However, simple recall of advice regarding a more preventive lifestyle does not produce sustainable behavioral lifestyle changes. We investigate the effect of plaque visualization combined with low-threshold daily lifestyle tasks using the smartphone app PreventiPlaque to evaluate changes in cardiovascular risk profile. Methods and study design: This randomized, controlled clinical trial includes 240 participants with ultrasound evidence of atherosclerotic plaque in one or both carotid arteries, defined as focal thickening of the vessel wall measuring 50% more than the regular vessel wall. A criterion for participation is access to a smartphone suitable for app usage. The participants are randomly assigned to an intervention or a control group. While both groups receive the standard of care, the intervention group has additional access to the PreventiPlaque app during the 12-month follow-up. The app includes daily tasks that promote a healthier lifestyle in the areas of smoking cessation, medication adherence, physical activity, and diet. The impact of plaque visualization and app use on the change in cardiovascular risk profile is assessed using SCORE2. The feasibility and effectiveness of the PreventiPlaque app are evaluated using standardized and validated measures for patient feedback.
The production of liquid-gas mixtures with desired properties still places high demands on process technology and is usually realized in bubble columns. The physical calculation models used have individual dimensionless factors which, depending on the application, are only valid for small ranges of flow velocity, nozzle geometry and test setup. An iterative but time-consuming design of such dispersion processes is used in industry to produce a liquid-gas mixture according to desired requirements. In the present investigation, we accelerate the necessary design loops by setting up a physical model, consisting of several subsystems enriched by dedicated experiments, to realize liquid-gas dispersions with low volume fraction and small air bubble diameters in oil. Our approach allows the extraction of individual dimensionless factors from maps of the introduced subsystems. These maps allow for targeted corrective measures in a production process to maintain quality. The calculation-based approach avoids the need for iterative design loops. Overall, this approach supports the controlled generation of liquid-gas mixtures.
Creating a schedule to perform certain actions in a real-world environment typically involves multiple types of uncertainty. To create a plan that is robust to uncertainties, it must stay flexible while attempting to be reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive. The planning algorithms must be able to re-schedule planned actions and need to adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
The usage of data gathered for Industry 4.0 and smart factory scenarios continues to be a problem for companies of all sizes. This is often the case because they aim to start with complicated and time-intensive Machine Learning scenarios. This work evaluates the Process Capability Analysis (PCA) as a pragmatic, easy and quick way of leveraging the gathered machine data from the production process. The area of application considered is injection molding. After describing all the required domain knowledge, the paper presents an approach for a continuous analysis of all parts produced. Applying PCA results in multiple key performance indicators that allow for fast and comprehensible process monitoring. The corresponding visualizations provide the quality department with a tool to efficiently choose where and when quality checks need to be performed. The presented case study indicates the benefit of analyzing whole process data instead of considering only selected production samples. The use of machine data enables additional insights to be drawn about process stability and the associated product quality.
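The key performance indicators that Process Capability Analysis yields are the classical capability indices. As a minimal sketch (the tolerance limits and the synthetic part-weight data are assumptions made for the example, not values from the case study):

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Capability indices for a measured characteristic against its
    lower/upper specification limits. Cp ignores centring; Cpk
    penalises an off-centre process mean."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

rng = np.random.default_rng(1)
# Hypothetical part weights (grams) from an injection molding run,
# with the process mean slightly above the tolerance centre of 25.0 g
weights = rng.normal(loc=25.1, scale=0.05, size=500)
cp, cpk = process_capability(weights, lsl=24.85, usl=25.15)
print(round(cp, 2), round(cpk, 2))  # cpk < cp flags the off-centre mean
```

Tracking such indices continuously over all produced parts, as the paper proposes, gives the quality department a fast, interpretable signal of where and when inspections are worthwhile.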
Tap or swipe
(2023)
Demand-side management approaches that exploit the temporal flexibility of electric vehicles have attracted much attention in recent years due to their increasing market penetration. These demand-side management measures help alleviate the burden on the power system, especially in distribution grids, where bottlenecks are more prevalent. Electric vehicles are an attractive asset for distribution system operators, with the potential to provide grid services if properly managed. In this thesis, first, a systematic investigation is conducted for two demand-side management methods typically reported in the literature: a voltage droop control-based approach and a market-driven approach. Then, a control scheme for decentralized autonomous demand-side management of electric vehicle charging scheduling is proposed, which relies on a unidirectionally communicated grid-induced signal. In all the topics considered, the implications for distribution grid operation are evaluated using a set of time-series load flow simulations performed for representative Austrian distribution grids. Droop control mechanisms, which require no communication, are discussed for electric vehicle charging control. The method provides an economically viable solution at all penetrations if electric vehicles charge at low nominal power rates. However, given current market trends in residential charging equipment, especially in the European context, where most charging equipment is designed for 11 kW charging, the long-run technical feasibility of the method is debatable. As electricity demand strongly correlates with energy prices, a linear optimization algorithm is proposed to minimize charging costs, using next-day market prices as the grid-induced incentive function under the assumption of perfect user predictions. The constraints on the state of charge guarantee that the energy required for driving is delivered without failure.
An average energy cost saving of 30% is realized at all penetrations. Nevertheless, the avalanche effect due to simultaneous charging during low price periods introduces new power peaks exceeding those of uncontrolled charging. This obstructs the grid-friendly integration of electric vehicles.
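The cost-minimizing charging schedule described above can be sketched as a small linear program. The price profile, the 11 kW power limit, and the 40 kWh energy requirement below are illustrative assumptions, not the thesis's data:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical day-ahead prices for 24 hourly periods (EUR/kWh),
# with a cheap window in the early afternoon
prices = np.array([0.30] * 7 + [0.20] * 4 + [0.10] * 4 + [0.25] * 9)
p_max = 11.0  # kW charging power limit (typical European wallbox)
e_req = 40.0  # kWh required for the next day's driving

# Decision variables: charging power in each hour. The equality
# constraint delivers exactly the required energy (the state-of-charge
# guarantee); the bounds enforce the charger's power rating.
res = linprog(c=prices,
              A_eq=[np.ones(24)], b_eq=[e_req],
              bounds=[(0.0, p_max)] * 24)
schedule = res.x
print(round(res.fun, 2))  # -> 4.0: all 40 kWh land in the 0.10 EUR hours
```

The printed result also illustrates the avalanche effect noted in the thesis: with identical prices for all users, every optimizer concentrates charging in the same cheap hours, creating new coincident power peaks.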
Alleviating the curse of dimensionality in minkowski sum approximations of storage flexibility
(2023)
Many real-world applications require the joint optimization of a large number of flexible devices over some time horizon. The flexibility of multiple batteries, thermostatically controlled loads, or electric vehicles, e.g., can be used to support grid operations and to reduce operation costs. Using piecewise constant power values, the flexibility of each device over d time periods can be described as a polytopic subset in power space. The aggregated flexibility is given by the Minkowski sum of these polytopes. As the computation of Minkowski sums is in general demanding, several approximations have been proposed in the literature. Yet, their application potential is often objective-dependent and limited by the curse of dimensionality. In this paper, we show that up to 2d vertices of each polytope can be computed efficiently and that the convex hull of their sums provides a computationally efficient inner approximation of the Minkowski sum. Via an extensive simulation study, we illustrate that our approach outperforms ten state-of-the-art inner approximations in terms of computational complexity and accuracy for different objectives. Moreover, we propose an efficient disaggregation method applicable to any vertex-based approximation. The proposed methods provide an efficient means to aggregate and to disaggregate typical battery storages in quarter-hourly periods over an entire day with reasonable accuracy for aggregated cost and for peak power optimization.
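The core idea above, that summed support vertices of the individual polytopes are exact boundary points of the Minkowski sum, so their convex hull is an inner approximation, can be sketched in two dimensions. The box-plus-energy-cap device polytopes and their parameters are toy assumptions, not the paper's flexibility model:

```python
import numpy as np
from scipy.spatial import ConvexHull
from scipy.optimize import linprog

# Toy flexibility polytope for one device over d = 2 time periods:
# {x : 0 <= x_i <= p_max, x_1 + x_2 <= e_max} (hypothetical parameters).
def support_vertex(w, p_max, e_max):
    """A vertex of the device polytope maximizing direction w (via an LP)."""
    res = linprog(c=-w, A_ub=[[1.0, 1.0]], b_ub=[e_max],
                  bounds=[(0.0, p_max)] * 2)
    return res.x

devices = [(3.0, 4.0), (2.0, 3.0)]  # (p_max, e_max) per device

# Key property: the Minkowski sum's support point in direction w is the
# sum of the individual support points. Summing per-direction vertices
# therefore yields exact boundary points of the aggregate polytope.
directions = [np.array([np.cos(a), np.sin(a)])
              for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)]
points = np.array([sum(support_vertex(w, *dev) for dev in devices)
                   for w in directions])
hull = ConvexHull(points)  # inner approximation of the Minkowski sum
print(hull.volume)         # its area (2-D "volume")
```

Each direction costs one small LP per device, which is why this construction scales far better with the horizon length d than computing the exact Minkowski sum.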
In the era of digital transformation, an evolution is taking place. New perspectives on leadership are therefore required, especially in virtual teams. Shared leadership is a promising leadership form for meeting the challenges of a virtual team setting. In particular, studies show that shared leadership increases performance, team creativity and innovative behavior. Moreover, responsibility is distributed among several individuals rather than one. Nevertheless, it is unclear which skills are needed in shared leadership teams and how they could be trained. We therefore develop a conceptual framework to pave the way for an empirical inquiry into the skills for, and the role of, shared leadership. Moreover, we encourage discussion of whether current leadership development is still viable and offer practical implications for developing shared leadership.
A model is presented that allows for the calculation of the success probability with which a vanilla Evolution Strategy converges to the global optimizer of the Rastrigin test function. As a result, a population size scaling formula is derived that allows for an estimation of the population size needed to ensure high convergence security, depending on the search space dimensionality.
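The success probability that the model predicts can also be estimated empirically by repeated runs. The sketch below uses a simple (mu/mu_I, lambda)-ES with a deterministic step-size decay rather than the self-adaptive strategy analysed in the paper, and all run parameters are illustrative assumptions:

```python
import numpy as np

def rastrigin(x, a=10.0):
    """Rastrigin test function; global minimum f(0) = 0."""
    return a * x.size + np.sum(x**2 - a * np.cos(2 * np.pi * x))

def es_run(n=2, mu=10, lam=40, sigma=2.0, gens=250, rng=None):
    """One (mu/mu_I, lambda)-ES run: sample lam offspring around the
    parental centroid, recombine the mu best. Returns True if the run
    ends near the global optimizer rather than a local one."""
    parent = rng.uniform(-5, 5, n)
    for _ in range(gens):
        offspring = parent + sigma * rng.standard_normal((lam, n))
        fitness = np.array([rastrigin(x) for x in offspring])
        parent = offspring[np.argsort(fitness)[:mu]].mean(axis=0)
        sigma *= 0.97  # assumed decay instead of self-adaptation
    return rastrigin(parent) < 1e-2

rng = np.random.default_rng(2)
runs = [es_run(rng=rng) for _ in range(50)]
rate = sum(runs) / len(runs)
print(rate)  # empirical success probability
```

In the paper's terms, sweeping the population size lambda and the dimensionality n in such an experiment is what the derived scaling formula replaces analytically.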
Effective lead management
(2023)
In the last few years, global interest in lead management has increased. This classic topic for marketing and sales departments aims at converting potential customers into sales. The following thesis identifies the challenges and solutions for marketing and sales departments in implementing effective lead management. Using data from a literature review and qualitative empirical research conducted with representatives of marketing and sales departments, the results show overall and task-specific challenges and solutions. The research indicates overall challenges and solutions regarding the gap between marketing and sales, new processes, and data management, including data quality, software and silos. In addition, task-specific challenges and solutions were identified concerning lead generation (including purchased leads), lead qualification, lead nurturing, and sales-specific issues, namely the focus on existing customers, time famine and lead routing. This thesis provides a framework for further studies on the challenges and solutions for marketing and sales departments implementing lead management.