Refine
Document Type
- Conference Proceeding (235)
- Article (214)
- Part of a Book (45)
- Master's Thesis (30)
- Book (16)
- Doctoral Thesis (8)
- Report (6)
- Periodical (3)
- Other (2)
- Part of Periodical (2)
- Working Paper (2)
- Habilitation (1)
Institute
- Forschungszentrum Mikrotechnik (181)
- Forschungszentrum Business Informatics (120)
- Department of Computer Science (108)
- Wirtschaft (88)
- Forschungszentrum Energie (54)
- Didaktik (37)
- Soziales und Gesundheit (24)
- Forschungszentrum Nutzerzentrierte Technologien (18)
- Department of Engineering (11)
- Forschungszentrum Sozial- und Wirtschaftswissenschaften (9)
Language
- English (564)
Keywords
- Laser ablation (9)
- Optimization (6)
- Y-branch splitter (6)
- arrayed waveguide gratings (6)
- integrated optics (6)
- Evolution strategy (5)
- Mathematical model (5)
- OCT (5)
- Volatile organic compounds (5)
- insertion loss (5)
In this work, we present a significant step toward in vivo ophthalmic optical coherence tomography and angiography on a photonic integrated chip. The diffraction gratings used in spectral-domain optical coherence tomography can be replaced by photonic integrated circuits comprising an arrayed waveguide grating. Two arrayed waveguide grating designs with 256 channels were tested, which enabled the first chip-based in vivo three-dimensional optical coherence tomography and angiography measurements of the human retina. Design 1 supports a bandwidth of 22 nm, with which a sensitivity of up to 91 dB (830 µW) and an axial resolution of 10.7 µm were measured. Design 2 supports a bandwidth of 48 nm, with which a sensitivity of 90 dB (480 µW) and an axial resolution of 6.5 µm were measured. The silicon nitride-based integrated optical waveguides were fabricated with a fully CMOS-compatible process, which allows their monolithic co-integration on top of an optoelectronic silicon chip. As a benchmark for chip-based optical coherence tomography, tomograms generated by a commercially available clinical spectral-domain optical coherence tomography system were compared to those acquired with the on-chip gratings. The similarities between the tomograms demonstrate the significant clinical potential of further integrated optical coherence tomography-on-a-chip systems.
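For orientation, the axial resolutions quoted above follow the usual relation between source bandwidth and resolution for a Gaussian spectrum; the center wavelength is not stated in the abstract, so the roughly 840 nm used below is an assumption typical of retinal OCT, not a value from the paper.

```latex
% Theoretical axial resolution of spectral-domain OCT for a Gaussian spectrum
% (\lambda_0 \approx 840 nm is an assumed center wavelength, not given above)
\delta z = \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda}
\approx \frac{2\ln 2}{\pi}\,\frac{(840\,\mathrm{nm})^{2}}{48\,\mathrm{nm}}
\approx 6.5\,\mu\mathrm{m}
```

Under this assumption the estimate is consistent with the 6.5 µm reported for Design 2; deviations for Design 1 would reflect the actual, non-Gaussian spectral shape.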
With cloud computing and multi-core CPUs, parallel computing resources are becoming increasingly affordable and commonly available. Parallel programming should be just as easily accessible to everyone. Unfortunately, existing frameworks and systems are powerful but often very complex to use for anyone who lacks knowledge of the underlying concepts. This paper introduces a software framework and execution environment whose objective is to provide a system that is easy to use for everyone who could benefit from parallel computing. Some real-world examples are presented, with an explanation of all the steps that are necessary to compute in a parallel and distributed manner.
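As an illustration of the kind of low-barrier parallelism the paper aims to make accessible, the sketch below uses only Python's standard multiprocessing pool; it is not the framework or execution environment presented in the paper.

```python
# Minimal sketch: distributing an embarrassingly parallel workload across CPU
# cores with Python's standard library (illustrative only; this is not the
# framework described in the paper).
from multiprocessing import Pool

def simulate(seed: int) -> float:
    """Placeholder for an independent, compute-heavy task."""
    total = 0.0
    for i in range(1, 100_000):
        total += (seed % 7 + i) ** 0.5
    return total

if __name__ == "__main__":
    with Pool() as pool:                      # uses all available cores
        results = pool.map(simulate, range(32))
    print(sum(results))
```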
We present a new concept of a 3D polymer-based 1 × 4 beam splitter for wavelength splitting around 1550 nm. The beam splitter consists of IP-Dip polymer as the core and polydimethylsiloxane (PDMS) Sylgard 184 as the cladding. The splitter was designed and simulated with two different photonics tools, and the results show a high splitting ratio for single-mode and multi-mode operation with low losses. Based on the simulations, a 3D beam splitter was designed and realized using a direct laser writing (DLW) process adapted for coupling to a standard single-mode fiber. With respect to the technological limits, a multi-mode splitter with a core of (4 × 4) µm² was designed and fabricated together with a stable supporting mechanical construction. The splitting properties were investigated by monitoring the intensity at the splitter outputs using optical microscopy and near-field scanning optical microscopy. In the development phase, the optical performance of the fabricated beam splitter was examined by splitting short visible wavelengths from a red light-emitting diode. Finally, the splitting of 1550 nm laser light was studied in detail by near-field measurements and compared with the simulated results. Nearly single-mode operation was observed, and the shape of the propagating mode and the mode field diameter were well resolved.
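The splitting ratio and losses referred to above are conventionally quantified per output port; the definitions below are the generic ones and quote no values from this work.

```latex
% Generic figures of merit for an N-port splitter (no values from this work):
% insertion loss of output port i and its splitting ratio.
\mathrm{IL}_i = -10\,\log_{10}\!\frac{P_{\mathrm{out},i}}{P_{\mathrm{in}}}\ [\mathrm{dB}],
\qquad
\mathrm{SR}_i = \frac{P_{\mathrm{out},i}}{\sum_{j=1}^{N} P_{\mathrm{out},j}}
```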
Recently, the use of microRNAs (miRNAs) as biomarkers for a multitude of diseases has gained substantial significance for clinical as well as point-of-care diagnostics. Amongst other challenges, however, it carries the central requirement that the concentration of a given miRNA must be evaluated within the context of other factors in order to unambiguously diagnose one specific disease. In terms of the development of diagnostic methods and devices, this implies an inevitable demand for multiplexing in order to gauge the abundance of several components of interest in a patient’s sample in parallel. In this study, we design and implement different multiplexed versions of our electrochemical microfluidic biosensor by dividing its channel into subsections, creating four novel chip designs for the amplification-free and simultaneous quantification of up to eight miRNAs on the CRISPR-Biosensor X (‘X’ highlighting the multiplexing aspect of the device). We then use a one-step model assay followed by amperometric readout in combination with a 2-minute stop-flow protocol to explore the fluidic and mechanical characteristics and limitations of the different versions of the device. The sensor showing the best performance is subsequently used for the Cas13a-powered proof-of-concept measurement of two miRNAs (miRNA-19b and miRNA-20a) from the miRNA-17∼92 cluster, which is dysregulated in the blood of pediatric medulloblastoma patients. Quantification of the latter, alongside simultaneous negative control measurements, is accomplished on the same device. We thereby confirm the applicability of our platform to the challenge of amplification-free, parallel detection of multiple nucleic acids.
Mobility choices - an instrument for precise automatized travel behavior detection & analysis
(2021)
This study deals with the energy situation in Ny-Ålesund, an Arctic research station on the Svalbard archipelago, and aims at analysing the technical feasibility of a transition to renewable energies, taking into consideration both the environmental and climatic impediments.
The analysis is based on a 27-year collection of authentic meteorological data with all its strong fluctuations, seasonal as well as yearly. Great emphasis was put on the discussion of tried-and-tested renewable technologies, which were compared to a new wind-based energy device that has yet to be tested for its reliability in the harsh environment of, notably, the Arctic winter. Meticulous calculations led to the result that bifacial solar modules are an efficient means even in months when the sun stands low, and that their combination with wind-based devices proves to generate a maximum output. Geothermal energy seems promising in the region but could not be evaluated due to a crucial lack of relevant data.
The study comes to the conclusion that the research station of Ny-Ålesund could well rely on a combination of renewable energy devices to cover its energy load, but needs to keep a back-up system of diesel-run generators to bridge short periods of possible dysfunctions or standstills due to meteorological circumstances. Battery storage could only contribute to solving the problem of an unplanned interruption of the energy supply; it cannot serve as the entire back-up system since, at present, the required capacity would exceed all feasible dimensions.
The importance of Agent-Based Simulation (ABS) as a scientific method to generate data for scientific models in general, and for informed policy decisions in particular, has been widely recognised. However, the important technique of code testing of implementations, such as unit testing, has not generated much research interest so far. As a possible solution, in previous work we explored the conceptual use of property-based testing. In this code testing method, model specifications and invariants are expressed directly in code and tested through automated and randomised test data generation. This paper expands on our previous work and explores how to use property-based testing on a technical level to encode and test specifications of ABS. As a use case, the simple agent-based SIR model is used, and it is shown how to test agent behaviour, transition probabilities and model invariants. The outcome is a set of specifications expressed directly in code, which relate whole classes of random input to expected classes of output. During test execution, random test data is generated automatically, potentially covering the equivalent of thousands of unit tests, run within seconds on modern hardware. This makes property-based testing in the context of ABS strictly more powerful than unit testing, and a much more natural fit given the stochastic nature of ABS.
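To make the idea concrete, a minimal property-based test of an SIR-style invariant is sketched below in Python with the hypothesis library; the paper's own implementation is not reproduced here, and the step() dynamics are a simplified stand-in.

```python
# Sketch of property-based testing for an agent-based SIR model
# (illustrative only; step() is a simplified stand-in, not the paper's code).
import random
from hypothesis import given, strategies as st

def step(s: int, i: int, r: int, beta: float, gamma: float, rng: random.Random):
    """One crude stochastic SIR step: some S become I, some I become R."""
    new_infections = sum(rng.random() < beta for _ in range(s)) if i > 0 else 0
    new_recoveries = sum(rng.random() < gamma for _ in range(i))
    return s - new_infections, i + new_infections - new_recoveries, r + new_recoveries

@given(
    s=st.integers(min_value=0, max_value=500),
    i=st.integers(min_value=0, max_value=500),
    r=st.integers(min_value=0, max_value=500),
    beta=st.floats(min_value=0.0, max_value=1.0),
    gamma=st.floats(min_value=0.0, max_value=1.0),
    seed=st.integers(min_value=0),
)
def test_population_is_conserved(s, i, r, beta, gamma, seed):
    # Model invariant: a step neither creates nor destroys agents,
    # and no compartment can become negative.
    s2, i2, r2 = step(s, i, r, beta, gamma, random.Random(seed))
    assert s2 + i2 + r2 == s + i + r
    assert min(s2, i2, r2) >= 0
```

Each run of the test generates hundreds of random populations and parameter combinations, which is exactly the "whole classes of input" behaviour described above.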
Over the last years, polymers have gained great attention as substrate materials because of the possibility to produce low-cost sensors in a high-throughput manner or for rapid prototyping, and because of the wide variety of polymeric materials available with different features (such as transparency, flexibility and stretchability). For almost all biosensing applications, the interaction between biomolecules (for example, antibodies, proteins or enzymes) and the employed substrate surface is highly important. In order to realize an effective biomolecule immobilization on polymers, different surface activation techniques exist, including chemical and physical methods. Among them, plasma treatment offers an easy, fast and effective activation of the surfaces by micro-/nanotexturing and by generating functional groups (including carboxylic acids, amines, esters, aldehydes or hydroxyl groups). Hence, we present a systematic and comprehensive plasma activation study of various polymeric surfaces, optimizing different parameters including power, time, substrate temperature and gas composition. The highest immobilization efficiency, along with a homogeneous biomolecule distribution, is achieved with a 5-min plasma treatment under a gas composition of 50% oxygen and 50% nitrogen, at a power of 1000 W and a substrate temperature of 80 °C. These results are also confirmed by different surface characterization methods, including SEM, XPS and contact angle measurements.
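For quick reference, the reported optimum can be written down as a plain parameter set; the values are taken from the abstract, while the key names and the 50/50 reading of the gas composition are illustrative assumptions.

```python
# Optimized plasma activation parameters as reported in the abstract above.
# Key names are illustrative, not an instrument API; the 50/50 O2/N2 split is
# an assumed reading of "50% oxygen and nitrogen".
optimized_plasma_treatment = {
    "duration_min": 5,                 # treatment time in minutes
    "power_W": 1000,                   # generator power in watts
    "substrate_temperature_C": 80,     # substrate temperature in degrees Celsius
    "gas_composition": {"O2": 0.5, "N2": 0.5},
}
```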
For a given set of banks, how big can losses in bad economic or financial scenarios possibly get, and what are these bad scenarios? These are the two central questions of stress tests for banks and the banking system. Current stress tests select stress scenarios in a way which might leave aside many dangerous scenarios and thus create an illusion of safety, or which might consider highly implausible scenarios and thus trigger a false alarm. We show how to select scenarios systematically for a banking system in a context of multiple credit exposures. We demonstrate the application of our method in an example on the Spanish and Italian residential real estate exposures of European banks. Compared to the EBA 2016 stress test, our method produces scenarios which are as plausible as the EBA stress scenario but yield considerably worse system-wide losses.
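One common way to make "systematic" scenario selection of this kind precise, which may differ in detail from the method of the paper, is to search for the worst loss over all scenarios of at least a given plausibility:

```latex
% Worst-case search over an ellipsoid of plausible risk-factor moves
% (generic formalisation for illustration; not necessarily the paper's method)
\max_{x}\; L(x)
\quad \text{s.t.} \quad (x-\mu)^{\top}\Sigma^{-1}(x-\mu) \le k^{2}
```

Here x is the vector of risk-factor moves (e.g. residential real-estate price shocks), Σ their covariance, L the system-wide loss, and k the admitted implausibility radius; for a linear loss L(x) = w^T(μ − x) the maximiser is x* = μ − kΣw / √(w^TΣw).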
Gas hydrates are usually synthesized by bringing together a pressurized gas and liquid or solid water. In both cases, the transport of gas or water to the hydrate growth site is hindered once an initial film of hydrate has grown at the water–gas interface. A seemingly forgotten gas-phase technique overcomes this problem by slowly depositing water vapor on a cold surface in the presence of the pressurized guest gas. Despite being used for the synthesis of low-formation-pressure hydrates, it has not yet been tested for hydrates of CO2 and CH4. Moreover, the potential of the technique for the study of hydrate decomposition has not been recognized yet. We employ two advanced implementations of the condensation technique to form hydrates of CO2 and CH4 and demonstrate the applicability of the process for the study of hydrate decomposition and the phenomenon of self-preservation. Our results show that CO2 and CH4 hydrate samples deposited on graphite at 261–265 K are almost pure hydrates with an ice fraction of less than 8%. Rapid depressurization experiments with thin deposits (approx. 330 µm thickness) of CO2 hydrate on an aluminum surface at 265 K yield identical dissociation curves when the deposition is done at identical pressure. However, hydrates deposited at 1 MPa almost completely withstand decomposition after rapid depressurization to 0.1 MPa, while samples deposited at 2 MPa decompose 7 times faster. Therefore, this synthesis technique is not only applicable for the study of hydrate decomposition but can also be used for the controlled deposition of a super-preserved hydrate.
The Digital Factory Vorarlberg is the youngest research center of Vorarlberg University of Applied Sciences. In the lab of the research center, a research and learning factory has been established for educating students and employees of industrial partners. Showcases and best-practice scenarios for various topics of digitalization in the manufacturing industry are demonstrated. In addition, novel methods and technologies for digital production, cloud-based manufacturing, data analytics, IT and OT security, and digital twins are being developed. The factory comprises only a minimal core of logistics and fabrication processes to guarantee manageability within an academic setup. As a product, fidget spinners are fabricated. A webshop allows customers to individually design their products and directly place orders in the factory. A centralized SCADA system is the core data hub of the factory. Various data analytics tools and methods and a novel database for IoT applications are connected to the SCADA system. As an alternative to on-premises manufacturing, orders can be pushed to a cloud-based manufacturing platform, which has been developed at the Digital Factory. A broker system allows fabrication in distributed facilities and offers various optimization services. Concepts such as outsourcing product configuration to customers or new types of engineering services in cloud-based manufacturing can be explored and demonstrated. In this paper, we present the basic concept of the Digital Factory Vorarlberg, as well as some of the newly developed topics.
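As a purely hypothetical illustration of the data flow described above, an order handed from the webshop to the SCADA data hub or the cloud-manufacturing broker could look like the record below; all field names and values are invented and do not describe the factory's actual schema.

```python
# Hypothetical order record from the webshop to the SCADA hub or cloud broker
# (all field names and values are invented for illustration).
from dataclasses import dataclass

@dataclass
class FidgetSpinnerOrder:
    order_id: str
    customer_id: str
    body_color: str
    engraving_text: str
    quantity: int = 1
    route: str = "on_premises"   # or "cloud_broker" for distributed fabrication

order = FidgetSpinnerOrder(
    order_id="WS-000123",
    customer_id="C-42",
    body_color="anthracite",
    engraving_text="FHV",
    quantity=2,
    route="cloud_broker",
)
```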
Real-time measurements of the differences in inhaled and exhaled, unlabeled and fully deuterated acetone concentration levels, at rest and during exercise, have been conducted using proton transfer reaction mass spectrometry. A novel approach to continuously differentiate between the inhaled and exhaled breath acetone concentration signals is used. This leads to unprecedentedly fine-grained data on inhaled and exhaled concentrations. The experimental results obtained are compared with those predicted using a simple three-compartment model that theoretically describes the influence of inhaled concentrations on exhaled breath concentrations for volatile organic compounds with high blood:air partition coefficients, and which is hence appropriate for acetone. Agreement between the predicted and observed concentrations is obtained. Our results highlight that the influence of the upper airways cannot be neglected for volatiles with high blood:air partition coefficients, i.e. highly water-soluble volatiles.
Clathrate hydrates, or hydrates for short, are inclusion compounds in which water molecules form a hydrogen-bonded host lattice that accommodates the guest molecules. While vast amounts of hydrates are known to exist in seafloor sediments and in the permafrost on Earth, these occurrences might be dwarfed by the amounts of hydrates occurring in space and on celestial bodies. Since methane is the primary guest molecule in most of the natural occurrences on Earth, hydrates are considered a promising source of energy. Moreover, the ability of one volume of hydrate to store about 170 volumes of gas makes hydrates a promising functional material for various industrial applications. While the static properties of hydrates are reasonably well known, the dynamics of hydrate formation and decomposition are insufficiently understood. For instance, the stochastic period of hydrate nucleation, the memory effect, and the self-preservation phenomenon complicate the development of predictive models of hydrate dynamics. Additionally, the influence of meso- and macroscopic defects as well as the roles of mass and heat transport on different length scales remain to be clarified.
Due to its non-invasive and non-destructive nature and its high spatial resolution of approx. 1 µm or even less, micro-computed X-ray attenuation tomography (µCT) seems to be the perfect method for the study of the evolving structures of forming or decomposing hydrates on the meso- and macroscopic length scale. However, for the naturally occurring hydrates of low atomic number guests, the contrast between hydrate, ice, and liquid water is typically very weak because of similar X-ray attenuation coefficients. So far, good contrast was restricted to synchrotron beamline experiments, which utilize the phase information of monochromatic X-rays.
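The weak contrast described here follows directly from the attenuation law governing absorption-based µCT: the reconstructed image encodes only the linear attenuation coefficients, and these are nearly identical for ice, liquid water and hydrates of low atomic number guests.

```latex
% Beer-Lambert attenuation along a ray of path length x through a phase with
% linear attenuation coefficient \mu; image contrast scales with \Delta\mu.
I = I_{0}\, e^{-\mu x},
\qquad
\text{contrast} \;\propto\; \Delta\mu = \mu_{\mathrm{hydrate}} - \mu_{\mathrm{ice}}
\quad (\text{small for low-}Z\text{ guests})
```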
In this thesis it is shown that, with the help of a newly developed sample cell, a contrast between the hydrate and the ice phase sufficiently good for the reliable segmentation of the materials can also be achieved in conventional tube-based µCT. Accurate pressure and temperature management, i.e. the added functionality of the cell, further allows for the cross-correlation of structural and thermodynamic data. The capability of this µCT setup is demonstrated in a series of studies on the formation and decomposition of hydrates, which yield new insights for the development of a novel route to hydrate synthesis. Finally, this thesis points toward possibilities for developing better models of hydrate formation and decomposition with the aid of µCT and computer simulations.
Investigation of non-uniformly emitting optical fiber diffusers on the light distribution in tissue
(2020)
The humidification-dehumidification (HDH) process for desalination is a promising technology to address water scarcity in rural regions. However, low humidifier efficiency is a weakness of the process. Bubble column humidifiers (BCHs) are promising for HDH, as they provide enhanced heat and mass transfer and have low maintenance requirements. Previous studies of HDH systems with BCHs draw different conclusions regarding the impact of superficial air velocity and liquid height on the humidification. Furthermore, the impact of flow characteristics has never been investigated systematically. In this study, an optimized BCH test setup that allows for optical analysis of the humidifier is used and evaluated. The test setup is validated by reproducing the known exponential influence of water temperature on the humidification. Measurements with seawater show that the normalised system productivity is increased by about 56 % with an increase in superficial air velocity from 0.5 to 5 cm/s. Furthermore, the system productivity is increased by around 29 % with an increase in liquid height from 60 to 378 mm. While the impact of superficial air velocity can be traced back to temperature changes at the humidifier and dehumidifier outlets, the impact of liquid height is shown to be caused by a smaller heat-loss surface in the humidifier as the liquid height increases. For the sieve plate orifice diameter, a clear influence on the humidification is not apparent; this parameter needs to be investigated further. Finally, our new test setup allows for analysing the humidification of air (1) in a systematic way, (2) in relevant measurement ranges and (3) in comparison with optical analyses of the flow characteristics.
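For clarity, the superficial air velocity varied above is the standard bubble-column quantity, i.e. the gas volume flow referred to the empty column cross-section (a textbook definition, not specific to this setup):

```latex
% Superficial gas velocity (standard definition)
u_{s} = \frac{\dot{V}_{\mathrm{gas}}}{A_{\mathrm{column}}}
```

The reported range of 0.5 to 5 cm/s therefore corresponds to a tenfold increase in gas throughput at a fixed column cross-section.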