The utilization of lasers in dentistry has expanded greatly in recent years. For instance, fs-lasers are effective for both drilling and caries prevention, while cw-lasers are useful for adhesive hardening. A cutting-edge application of lasers in dentistry is the debonding of veneers. While tools for this purpose already exist, there is still potential for improvement. Initial efforts to investigate laser-assisted debonding mechanisms through measurements of the optical and mechanical properties of teeth and prosthetic ceramics are presented. Preliminary tests conducted with a commercially available laser system used for debonding showed differences between the output power set at the system's console and that measured at specified distances from the handpiece. Furthermore, the optical properties of the samples (human teeth and ceramics) were characterised. The optical properties of the ceramics should closely resemble those of teeth in terms of look and feel, but they also influence the laser-assisted debonding technique and must therefore be taken into account. In addition, first attempts were made to investigate the mechanical properties of the samples by means of pump-probe elastography under a microscope. By analyzing the sample surface up to 20 ns after the impact of a fs-laser pulse, pressure and shock waves could be detected, which can be utilized to determine the elastic constants of specific materials. Together, such investigations form the basis for a purely optical approach to the debonding of veneers utilizing acoustic waves.
Power plant operators increasingly rely on predictive models to diagnose and monitor their systems. Data-driven prediction models are generally simple and can have high precision, making them superior to physics-based or knowledge-based models, especially for complex systems like thermal power plants. However, the accuracy of data-driven predictions depends on (1) the quality of the dataset, (2) a suitable selection of sensor signals, and (3) an appropriate selection of the training period. In some instances, redundancies and irrelevant sensors may even reduce the prediction quality.
We investigate ideal configurations for predicting the live steam production of a solid fuel-burning thermal power plant in the pulp and paper industry for different modes of operation. To this end, we benchmark four machine learning algorithms on two feature sets and two training sets to predict steam production. Our results indicate that with the best possible configuration, a coefficient of determination of R^2 = 0.95 and a mean absolute error of MAE = 1.2 t/h at an average steam production of 35.1 t/h can be reached. On average, using a dynamic dataset for training lowers the MAE by 32% compared to a static dataset. A feature set based on expert knowledge lowers the MAE by an additional 32% compared to a simple feature set representing the fuel inputs. We conclude that, based on the static training set and the basic feature set, machine learning algorithms can identify long-term changes. When using a dynamic dataset, the performance parameters of thermal power plants are predicted with high accuracy, allowing short-term problems to be detected.
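The two reported metrics, R^2 and MAE, can be computed directly from predicted and measured values. A minimal sketch follows, using numpy on synthetic data (the paper's dataset is not reproduced here, so the values below are purely illustrative, not the study's results):

```python
import numpy as np

def r_squared(y_true, y_pred):
    # coefficient of determination R^2 = 1 - SS_res / SS_tot
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def mean_absolute_error(y_true, y_pred):
    return float(np.mean(np.abs(y_true - y_pred)))

# synthetic steam-production data in t/h (illustrative only)
rng = np.random.default_rng(42)
y_true = 35.1 + 5.0 * rng.standard_normal(500)    # "measured" production
y_pred = y_true + 1.5 * rng.standard_normal(500)  # an imperfect prediction

print(round(r_squared(y_true, y_pred), 3),
      round(mean_absolute_error(y_true, y_pred), 2))
```

The same two functions would serve to compare any of the benchmarked algorithm/feature-set/training-set configurations on held-out data.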
Highly-sensitive single-step sensing of levodopa by swellable microneedle-mounted nanogap sensors
(2023)
Microneedle (MN) sensing of biomarkers in interstitial fluid (ISF) can overcome the challenges of self-diagnosis of diseases by a patient, such as blood sampling, handling, and measurement analysis. However, MN sensing technologies still suffer from poor measurement accuracy due to the small amount of target molecules present in ISF, and they require multiple steps: ISF extraction, ISF isolation from the MN, and measurement with additional equipment. Here, we present a swellable MN-mounted nanogap sensor that can be inserted into skin tissue, absorb ISF rapidly, and measure biomarkers in situ by amplifying the measurement signals through redox cycling in nanogap electrodes. We demonstrate that the MN-nanogap sensor measures levodopa (LDA), a medication for Parkinson's disease, down to 100 nM in aqueous solution, and 1 μM in both a skin-mimicking gelatin phantom and porcine skin.
Organic acidurias (OAs), urea-cycle disorders (UCDs), and maple syrup urine disease (MSUD) belong to the category of intoxication-type inborn errors of metabolism (IT-IEM). Liver transplantation (LTx) is increasingly utilized in IT-IEM. However, studies of its impact have mainly focused on clinical outcome measures and rarely on health-related quality of life (HRQoL). The aim of the study was to investigate the impact of LTx on HRQoL in IT-IEM. This single-center prospective study involved 32 patients (15 OA, 11 UCD, 6 MSUD; median age at LTx 3.0 years, range 0.8–26.0). HRQoL was assessed pre/post transplantation by the PedsQL General Module 4.0 and by MetabQoL 1.0, a tool specifically designed for IT-IEM. PedsQL highlighted significant post-LTx improvements in Total and Physical functioning in both patients' and parents' scores. According to age at transplantation (≤3 vs. >3 years), younger patients showed higher post-LTx scores on Physical (p = 0.03), Social (p < 0.001), and Total (p = 0.007) functioning. MetabQoL confirmed significant post-LTx changes in Total and Physical functioning in both patients' and parents' scores (p ≤ 0.009). Differently from PedsQL, MetabQoL Mental (patients p = 0.013, parents p = 0.03) and Social scores (patients p = 0.02, parents p = 0.012) were significantly higher post-LTx. Significant improvements (p = 0.001–0.04) were also detected in both self- and proxy-reports for almost all MetabQoL subscales. This study shows the importance of assessing the impact of transplantation on HRQoL, a meaningful outcome reflecting patients' wellbeing. LTx is associated with significant improvements of HRQoL in both self- and parent reports. The comparison between PedsQL-GM and MetabQoL highlighted that MetabQoL demonstrated higher sensitivity in the assessment of disease-specific domains than the generic PedsQL tool.
Long-term outcome of infantile-onset Pompe disease patients treated with enzyme replacement therapy
(2024)
Background: Enzyme replacement therapy (ERT) with recombinant human alglucosidase alfa (rhGAA) was approved in Europe in 2006. Nevertheless, data on the long-term outcome of infantile onset Pompe disease (IOPD) patients at school age is still limited.
Objective: We analyzed in detail cardiac, respiratory, motor, and cognitive function of 15 German-speaking patients aged 7 and older who started ERT at a median age of 5 months.
Results: Starting dose was 20 mg/kg biweekly in 12 patients, 20 mg/kg weekly in 2, and 40 mg/kg weekly in one patient. CRIM-status was positive in 13 patients (86.7%) and negative or unknown in one patient each (6.7%). Three patients (20%) received immunomodulation. Median age at last assessment was 9.1 (7.0–19.5) years. At last follow-up 1 patient (6.7%) had mild cardiac hypertrophy, 6 (42.9%) had cardiac arrhythmias, and 7 (46.7%) required assisted ventilation. Seven patients (46.7%) achieved the ability to walk independently and 5 (33.3%) were still ambulatory at last follow-up. Six patients (40%) were able to sit without support, while the remaining 4 (26.7%) were tetraplegic. Eleven patients underwent cognitive testing (Culture Fair Intelligence Test), while 4 were unable to meet the requirements for cognitive testing. Intelligence quotients (IQs) ranged from normal (IQ 117, 102, 96, 94) in 4 patients (36.4%) to mild developmental delay (IQ 81) in one patient (9.1%) to intellectual disability (IQ 69, 63, 61, 3x < 55) in 6 patients (54.5%). White matter abnormalities were present in 10 out of 12 cerebral MRIs from 7 patients.
Measuring what matters
(2023)
Patient-reported outcomes (PROs) are generally defined as ‘any report of the status of a patient's health condition that comes directly from the patient, without interpretation of the patient's response by a clinician or anyone else’. A broader definition of PRO also includes ‘any information on the outcomes of health care obtained directly from patients without modification by clinicians or other health care professionals’. Following this approach, PROs encompass patients' subjective perceptions of how they function or feel, not only in relation to a health condition but also to its treatment, as well as concepts such as health-related quality of life (HRQoL), information on the functional status of a patient, signs and symptoms, and symptom burden. PRO measurement instruments (PROMs) are mostly questionnaires and provide information about what patients can do and how they feel. PROs and PROMs have not yet found unconditional acceptance and wide use in the field of inborn errors of metabolism. This review summarises the importance and usefulness of PROs in research, drug legislation, and clinical care, and informs about quality standards, development, and potential methodological shortfalls of PROMs. Inclusion of PROs, measured with high-quality, well-selected PROMs, into clinical care, drug legislation, and research helps to identify unmet needs, improve quality of care, and define outcomes that are meaningful to patients. The field of IEM should open up to new methodological approaches, such as the definition of core sets of variables, including PROs, to be systematically assessed in specific metabolic conditions, and to new collaborations with PRO experts such as psychologists, to facilitate the systematic collection of meaningful data.
X-ray microtomography is a nondestructive, three-dimensional inspection technique applied across a vast range of fields and disciplines, from research to industry, encompassing engineering, biology, and medical research. Phase-contrast imaging extends the domain of application of x-ray microtomography to classes of samples that exhibit weak attenuation and thus appear with poor contrast in standard x-ray imaging. Notable examples are low-atomic-number materials, like carbon-fiber composites, soft matter, and biological soft tissues. We report on a compact and cost-effective system for x-ray phase-contrast microtomography. The system features high sensitivity to phase gradients and high resolution, requires only a low-power sealed x-ray tube and a single optical element, and fits in a small footprint. It is compatible with standard x-ray detector technologies: in our experiments, we observed that single-photon counting offered higher angular sensitivity, whereas flat panels provided a larger field of view. The system is benchmarked against known-material phantoms, and its potential for soft-tissue three-dimensional imaging is demonstrated on small-animal organs: a piglet esophagus and a rat heart. We believe that the simplicity of the setup we are proposing, combined with its robustness and sensitivity, will facilitate access to quantitative x-ray phase-contrast microtomography as a research tool across disciplines, including tissue engineering, materials science, and nondestructive testing in general.
Pooled data from published reports on infants with clinically diagnosed vitamin B12 (B12) deficiency were analyzed with the purpose of describing the presentation, diagnostic approaches, and risk factors for the condition to inform prevention strategies. An electronic (PubMed database) and manual literature search following the PRISMA approach was conducted (preregistration with the Open Science Framework, accessed on 15 February 2023). Data were described and analyzed using correlation analyses, Chi-square tests, ANOVAs, and regression analyses; in total, 102 publications (292 cases) were included. The mean age at first symptoms (anemia, various neurological symptoms) was four months; the mean time to diagnosis was 2.6 months. Maternal B12 at diagnosis, exclusive breastfeeding, and a maternal diet low in B12 predicted infant B12, methylmalonic acid, and total homocysteine. Infant B12 deficiency is still not easily diagnosed. Methylmalonic acid and total homocysteine are useful diagnostic parameters in addition to B12 levels. Since maternal B12 status predicts infant B12 status, it would probably be advantageous to target women in early pregnancy or even preconceptionally to prevent infant B12 deficiency, rather than to rely on newborn screening, which often does not reliably identify high-risk children.
Grey-box models provide an important approach for control analysis in the Heating, Ventilation and Air Conditioning (HVAC) sector. Grey-box models consist of physical models whose parameters are estimated from data. Given the vast number of component models found in the literature, the question arises as to which component models perform best on a given system or dataset. This question is investigated systematically using a test case system with real operational data. The test case system consists of an HVAC system containing an energy recovery unit (ER), a heating coil (HC) and a cooling coil (CC). For each component, several suitable model variants from the literature are adapted and implemented: four model variants for the ER and five model variants each for the HC and CC. Further, three global and four local optimization algorithms for solving the nonlinear least squares system identification problem are implemented, leading to a total of 700 combinations. The comparison of all variants shows that the global optimization algorithms do not provide significantly better solutions, while their runtimes are significantly higher. Analysis of the models shows that model accuracy depends on the total number of parameters.
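The core technique here, estimating a grey-box parameter by nonlinear least squares, can be illustrated on a deliberately simplified example. The sketch below is not one of the paper's component models or optimizers; it fits a single effectiveness parameter of a hypothetical heating-coil model to synthetic "operational data" with a hand-rolled Gauss-Newton step in numpy:

```python
import numpy as np

# Hypothetical heating-coil model (illustration only): outlet air temperature
#   t_out = t_in + (T_W - t_in) * (1 - exp(-a * u))
# with fixed hot-water temperature T_W, valve position u, and one unknown
# effectiveness parameter a to identify from data.
T_W = 60.0

def model(a, u, t_in):
    return t_in + (T_W - t_in) * (1.0 - np.exp(-a * u))

def fit_gauss_newton(u, t_in, t_out, a0=1.0, iters=20):
    """Scalar Gauss-Newton for nonlinear least squares on parameter a."""
    a = a0
    for _ in range(iters):
        r = t_out - model(a, u, t_in)            # residuals
        J = (T_W - t_in) * u * np.exp(-a * u)    # d(model)/da, analytic
        a += np.sum(J * r) / np.sum(J * J)       # Gauss-Newton update
    return a

# synthetic operational data generated with true a = 0.7 plus sensor noise
rng = np.random.default_rng(1)
u = rng.uniform(0.1, 1.0, 200)                   # valve position
t_in = 18.0 + rng.normal(0.0, 0.5, 200)          # inlet air temperature
t_out = model(0.7, u, t_in) + rng.normal(0.0, 0.05, 200)

print(round(fit_gauss_newton(u, t_in, t_out), 2))
```

A real grey-box identification would have several coupled parameters and would typically use a library solver (local or, as benchmarked in the abstract, global); the scalar update above is the same idea reduced to its smallest form.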
The production of liquid-gas mixtures with desired properties still places high demands on process technology and is usually realized in bubble columns. The physical calculation models used contain individual dimensionless factors which, depending on the application, are only valid for narrow ranges of flow velocity, nozzle geometry, and test setup. In industry, an iterative but time-consuming design process is used to produce a liquid-gas mixture according to the desired requirements. In the present investigation, we accelerate the necessary design loops by setting up a physical model consisting of several subsystems that are enriched by dedicated experiments, in order to realize liquid-gas dispersions with low volume fraction and small air bubble diameters in oil. Our approach allows the extraction of individual dimensionless factors from maps of the introduced subsystems. These maps allow for targeted corrective measures in a production process to maintain quality. The calculation-based approach avoids the need for iterative design loops. Overall, this approach supports the controlled generation of liquid-gas mixtures.
Creating a schedule to perform certain actions in a real-world environment typically involves multiple types of uncertainty. To create a plan which is robust towards uncertainties, it must stay flexible while attempting to be reliable and as close to optimal as possible. A plan is reliable if an adjustment to accommodate a new requirement causes only a few disruptions. The system needs to be able to adapt the schedule if unforeseen circumstances make planned actions impossible, or if an unlikely event would enable the system to follow a better path. To handle uncertainties, the methods used need to be dynamic and adaptive. The planning algorithms must be able to re-schedule planned actions and adapt the previously created plan to accommodate new requirements without causing critical disruptions to other required actions.
A model is presented that allows for the calculation of the probability with which a vanilla Evolution Strategy converges to the global optimizer of the Rastrigin test function. As a result, a population size scaling formula is derived that allows for an estimation of the population size needed to ensure high convergence security, depending on the search space dimensionality.
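For context, the Rastrigin function and a plain Evolution Strategy can be stated in a few lines. The sketch below (numpy; the population sizes, step-size schedule, and seed are arbitrary choices for illustration, not the parameters analysed in the paper) shows why convergence to the global optimizer is probabilistic: a single run may well end in one of the many local minima.

```python
import numpy as np

def rastrigin(x):
    # A = 10; highly multimodal, global minimum f(0) = 0
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def evolution_strategy(n=5, mu=20, lam=100, sigma=1.0, generations=200, seed=0):
    """Didactic (mu/mu, lambda)-ES with intermediate recombination and a
    deterministic step-size decay (an assumption for this sketch)."""
    rng = np.random.default_rng(seed)
    parent = rng.uniform(-5.12, 5.12, n)         # random start in the usual box
    for _ in range(generations):
        offspring = parent + sigma * rng.standard_normal((lam, n))
        fitness = np.array([rastrigin(o) for o in offspring])
        best = offspring[np.argsort(fitness)[:mu]]
        parent = best.mean(axis=0)               # intermediate recombination
        sigma *= 0.98                            # shrink mutation strength
    return parent, rastrigin(parent)

x_best, f_best = evolution_strategy()
print(f_best)  # small, but possibly a local minimum rather than f(0) = 0
```

Estimating the fraction of such runs that actually reach the global optimizer, as a function of population size and dimensionality, is exactly what the presented model addresses analytically.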
Activation of heat pump flexibilities is a viable solution to support balancing the grid via Demand Side Management measures and to fulfill the need for flexibility options. Aggregators, as the interface between prosumers, distribution system operators, and balance responsible parties, face the challenge of transforming prosumer information into aggregated available flexibility to enable trading thereof, under data privacy and technical restrictions. The literature lacks a generic, applicable, and widely accepted flexibility estimation method for heat pumps that incorporates reduced sensor and system information as well as system- and demand-dependent behaviour. In this paper, we adapt and extend a method from the literature by incorporating domain knowledge to overcome reduced sensor and system information. We apply data from five real-world heat pump systems, distinguish operation modes, estimate the power and energy flexibility of each single heat pump system, prove the transferability of the method, and aggregate the available flexibilities to showcase a small HP pool as a proof of concept.
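The notions of power and energy flexibility can be made concrete with a toy calculation. The sketch below is a hypothetical buffer-storage model invented for illustration (all names and formulas are assumptions, not the paper's estimation method): while the thermal buffer has headroom, the heat pump can be forced on (positive flexibility); while it holds stored heat, the heat pump can be blocked (negative flexibility).

```python
# Toy flexibility estimate for a single heat pump with a thermal buffer store.
def heat_pump_flexibility(p_el_kw, buffer_kwh_th, cop, soc):
    """Return (positive kWh_el, negative kWh_el, max forced-on hours)."""
    headroom_th = buffer_kwh_th * (1.0 - soc)  # thermal room left to charge
    stored_th = buffer_kwh_th * soc            # thermal energy available now
    pos_flex = headroom_th / cop               # extra electricity absorbable
    neg_flex = stored_th / cop                 # electricity consumption avoidable
    forced_on_h = pos_flex / p_el_kw           # duration the HP can be forced on
    return pos_flex, neg_flex, forced_on_h

pos, neg, hours = heat_pump_flexibility(p_el_kw=3.0, buffer_kwh_th=12.0,
                                        cop=4.0, soc=0.25)
print(pos, neg, hours)  # 2.25 0.75 0.75
```

Aggregation over a pool, as in the abstract, would then amount to summing such per-device estimates per time step, subject to each device's operation mode and constraints.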
Open tracing tools
(2023)
Background: To cope with the rapidly growing complexity of contemporary software architecture, tracing has become an increasingly critical practice and has been widely adopted by software engineers. By adopting tracing tools, practitioners are able to monitor, debug, and optimize distributed software architectures easily. However, with an excessive number of valid candidates, researchers and practitioners have a hard time finding and selecting suitable tracing tools by systematically considering their features and advantages. Objective: To this end, this paper aims to provide an overview of popular open tracing tools via comparison. Methods: Herein, we first identified 30 tools in an objective, systematic, and reproducible manner by adopting the Systematic Multivocal Literature Review protocol. Then, we characterized each tool looking at (1) measured features, (2) popularity both in peer-reviewed literature and online media, and (3) benefits and issues. We used topic modeling and sentiment analysis to extract and summarize the benefits and issues; in particular, we adopted ChatGPT to support the topic interpretation. Results: This paper presents a systematic comparison among the selected tracing tools in terms of their features, popularity, benefits, and issues. Conclusion: The results mainly show that each tracing tool provides a unique combination of features with different pros and cons. The contribution of this paper is to provide practitioners with a better understanding of the tracing tools, facilitating their adoption.
Immersive educational spaces
(2023)
"If only we had had such opportunities to grasp history like this when I was young" – words spoken by an almost 80-year-old woman holding an iPad on which both the buildings in the background and a tower, in the form of a virtual 3D object, appear within reach. To "grasp" history – what an apt use of this action-oriented word for an augmented reality application built on considerations of thinking and acting in history. This telling image emerged during the first test run of the app i.appear, which is the focus of this article's considerations on the use of immersive learning environments. The application i.appear has been in use in the city of Dornbirn (Austria) for a year now to teach historical content through location-based augmented reality and other interactive and multimedia technologies. After a brief description of the potential of such applications, the epistemological structure of the hosting app i.appear and its functionality are outlined. This article focuses on the "Baroque Master Builders" tour of the hosting app, which was created and tested as part of the current research.