In this work, we present a significant step toward in vivo ophthalmic optical coherence tomography and angiography on a photonic integrated chip. The diffraction gratings used in spectral-domain optical coherence tomography can be replaced by photonic integrated circuits comprising an arrayed waveguide grating. Two arrayed waveguide grating designs with 256 channels were tested, which enabled the first chip-based in vivo three-dimensional optical coherence tomography and angiography measurements of the human retina. Design 1 supports a bandwidth of 22 nm, with which a sensitivity of up to 91 dB (830 µW) and an axial resolution of 10.7 µm were measured. Design 2 supports a bandwidth of 48 nm, with which a sensitivity of 90 dB (480 µW) and an axial resolution of 6.5 µm were measured. The silicon nitride-based integrated optical waveguides were fabricated with a fully CMOS-compatible process, which allows their monolithic co-integration on top of an optoelectronic silicon chip. As a benchmark for chip-based optical coherence tomography, tomograms generated by a commercially available clinical spectral-domain optical coherence tomography system were compared to those acquired with the on-chip gratings. The similarities between the tomograms demonstrate the significant clinical potential of further integration of optical coherence tomography into an on-chip system.
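As a rough cross-check of the reported design figures, the minimal sketch below computes the theoretical free-space axial resolution from the source bandwidth, assuming a Gaussian spectrum and a centre wavelength of about 840 nm (an assumption; the centre wavelength is not stated in the abstract). Measured values also depend on the actual spectral shape, spectrometer roll-off and the imaging medium, so they need not match exactly.

```python
# Sketch: theoretical axial resolution of SD-OCT from source bandwidth.
# Assumes a Gaussian spectrum and a centre wavelength of ~840 nm
# (hypothetical; the centre wavelength is not given in the abstract).
import math

def axial_resolution_um(center_wavelength_nm: float, bandwidth_nm: float) -> float:
    """Free-space axial resolution dz = (2*ln2/pi) * lambda0^2 / d_lambda."""
    dz_nm = (2 * math.log(2) / math.pi) * center_wavelength_nm**2 / bandwidth_nm
    return dz_nm / 1000.0  # nm -> um

for bw in (22.0, 48.0):  # the two AWG design bandwidths described above
    print(f"bandwidth {bw:>4.0f} nm -> ~{axial_resolution_um(840.0, bw):.1f} um in air")
```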
With cloud computing and multi-core CPUs, parallel computing resources are becoming increasingly affordable and commonly available. Parallel programming should be just as easily accessible to everyone. Unfortunately, existing frameworks and systems are powerful but often very complex to use for anyone who lacks knowledge of the underlying concepts. This paper introduces a software framework and execution environment whose objective is to provide a system that is easy to use for anyone who could benefit from parallel computing. Some real-world examples are presented, with an explanation of all the steps that are necessary for computing in a parallel and distributed manner.
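To illustrate the kind of minimal interface such a system aims for, the sketch below distributes an ordinary function over worker processes with a single call. It uses Python's standard concurrent.futures as a stand-in, not the framework described in the paper; the `simulate` function is a hypothetical placeholder for an expensive, independent unit of work.

```python
# Sketch of a one-call parallel map, as a stand-in for the paper's framework.
from concurrent.futures import ProcessPoolExecutor

def simulate(seed: int) -> float:
    # placeholder for an expensive, independent unit of work
    total = 0.0
    for i in range(100_000):
        total += ((seed * 1_103_515_245 + i) % 1000) / 1000.0
    return total / 100_000

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:          # uses all available cores
        results = list(pool.map(simulate, range(8)))
    print(results)
```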
We present a new concept of a 3D polymer-based 1 × 4 beam splitter for wavelength splitting around 1550 nm. The beam splitter consists of IP-Dip polymer as the core and polydimethylsiloxane (PDMS) Sylgard 184 as the cladding. The splitter was designed and simulated with two different photonics tools, and the results show a high splitting ratio for single-mode and multi-mode operation with low losses. Based on the simulations, a 3D beam splitter was designed and realized using a direct laser writing (DLW) process, adapted for coupling to standard single-mode fiber. With respect to the technological limits, the multi-mode splitter with a core of (4 × 4) μm² was designed and fabricated together with a supporting, mechanically stable construction. The splitting properties were investigated by monitoring the intensity of the splitter outputs using optical microscopy and near-field scanning optical microscopy. In the development phase, the optical performance of the fabricated beam splitter was examined by splitting short visible wavelengths using a red light-emitting diode. Finally, the splitting of 1550 nm laser light was studied in detail by near-field measurements and compared with the simulated results. Nearly single-mode operation was observed, and the shape of the propagating mode and the mode field diameter were clearly identified.
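For readers unfamiliar with how splitting ratio and insertion loss are quantified from such output measurements, the short sketch below shows the standard dB bookkeeping for a 1 × 4 device. The power values are hypothetical placeholders, not measurements from the paper.

```python
# Sketch: splitting ratio, uniformity and insertion loss of a 1x4 splitter
# from per-port output powers. All power values are hypothetical.
import math

def db(ratio: float) -> float:
    return 10 * math.log10(ratio)

p_in_mw = 1.00                       # power launched into the input port (hypothetical)
p_out_mw = [0.21, 0.20, 0.22, 0.19]  # power at the four output ports (hypothetical)

total_out = sum(p_out_mw)
insertion_loss_db = -db(total_out / p_in_mw)        # total loss through the device
uniformity_db = db(max(p_out_mw) / min(p_out_mw))   # imbalance between ports
split_ratio = [p / total_out for p in p_out_mw]     # fraction delivered to each port

print(f"insertion loss: {insertion_loss_db:.2f} dB, uniformity: {uniformity_db:.2f} dB")
print("splitting ratio:", [f"{r:.1%}" for r in split_ratio])
```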
Recently, the use of microRNAs (miRNAs) as biomarkers for a multitude of diseases has gained substantial significance for clinical as well as point-of-care diagnostics. Amongst other challenges, however, it imposes the central requirement that the concentration of a given miRNA must be evaluated within the context of other factors in order to unambiguously diagnose one specific disease. For the development of diagnostic methods and devices, this implies an inevitable demand for multiplexing, so that the abundance of several components of interest in a patient's sample can be gauged in parallel. In this study, we design and implement different multiplexed versions of our electrochemical microfluidic biosensor by dividing its channel into subsections, creating four novel chip designs for the amplification-free and simultaneous quantification of up to eight miRNAs on the CRISPR-Biosensor X ('X' highlighting the multiplexing aspect of the device). We then use a one-step model assay followed by amperometric readout in combination with a 2-minute stop-flow protocol to explore the fluidic and mechanical characteristics and limitations of the different versions of the device. The sensor showing the best performance is subsequently used for the Cas13a-powered proof-of-concept measurement of two miRNAs (miRNA-19b and miRNA-20a) from the miRNA-17∼92 cluster, which is dysregulated in the blood of pediatric medulloblastoma patients. Quantification of the latter, alongside simultaneous negative control measurements, is accomplished on the same device. We thereby confirm the applicability of our platform to the challenge of amplification-free, parallel detection of multiple nucleic acids.
Mobility choices - an instrument for precise automatized travel behavior detection & analysis
(2021)
This study deals with the energy situation in Ny-Ålesund, an Arctic research station on the Svalbard archipelago, and aims at analysing the technical feasibility of a transition to renewable energies by taking into consideration both the environmental and climatic impediments.
The analysis is based on a 27-year collection of authentic meteorological data with all its strong seasonal and yearly fluctuations. Great emphasis was placed on the discussion of tried-and-tested renewable technologies, which were compared with a new wind-based energy device that has yet to be tested for its reliability in a harsh environment, notably the Arctic winter. Meticulous calculations led to the result that bifacial solar modules are an efficient means even in months when the sun stands low, and that their combination with wind-based devices generates a maximum output. Geothermal energy seems promising in the region, but could not be evaluated due to a crucial lack of relevant data.
The study comes to the conclusion that the research station of Ny-Ålesund could well rely on a combination of renewable energy devices to cover its energy load, but needs to keep a back-up system of diesel-run generators to bridge short periods of possible dysfunctions or standstills due to meteorological circumstances. Battery storage could only contribute to solving the problem of an unfortunate interruption of the energy supply; it cannot serve as the entire back-up system since, at present, the required capacity would exceed all feasible dimensions.
The importance of Agent-Based Simulation (ABS) as a scientific method to generate data for scientific models in general, and for informed policy decisions in particular, has been widely recognised. However, the important technique of testing implementation code, such as unit testing, has not generated much research interest so far. As a possible solution, in previous work we explored the conceptual use of property-based testing. In this code-testing method, model specifications and invariants are expressed directly in code and tested through automated and randomised test data generation. This paper expands on our previous work and explores how to use property-based testing on a technical level to encode and test specifications of ABS. As a use case, the simple agent-based SIR model is used, for which we show how to test agent behaviour, transition probabilities and model invariants. The outcome is a set of specifications expressed directly in code, which relate whole classes of random input to expected classes of output. During test execution, random test data is generated automatically, potentially covering the equivalent of thousands of unit tests, run within seconds on modern hardware. This makes property-based testing in the context of ABS strictly more powerful than unit testing, as it is a much more natural fit for ABS due to its stochastic nature.
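To make the idea concrete, the sketch below expresses two SIR model invariants (population conservation, monotone susceptible/recovered counts) as properties over randomly generated states and parameters. It uses Python's `hypothesis` library as a stand-in for the paper's actual tooling, and `step_sir` is a minimal illustrative transition function, not the implementation from the paper.

```python
# Sketch: property-based tests of SIR invariants, in the spirit of the approach above.
import random
from hypothesis import given, strategies as st

def step_sir(s: int, i: int, r: int, beta: float, gamma: float, rng: random.Random):
    """One stochastic step: some susceptibles get infected, some infected recover."""
    new_infections = sum(rng.random() < beta for _ in range(s)) if i > 0 else 0
    new_recoveries = sum(rng.random() < gamma for _ in range(i))
    return s - new_infections, i + new_infections - new_recoveries, r + new_recoveries

@given(s=st.integers(0, 200), i=st.integers(0, 200), r=st.integers(0, 200),
       beta=st.floats(0.0, 1.0), gamma=st.floats(0.0, 1.0),
       seed=st.integers(0, 2**32 - 1))
def test_sir_invariants(s, i, r, beta, gamma, seed):
    rng = random.Random(seed)
    s2, i2, r2 = step_sir(s, i, r, beta, gamma, rng)
    assert s2 + i2 + r2 == s + i + r   # total population is conserved
    assert s2 <= s                     # susceptibles can only decrease
    assert r2 >= r                     # recovered can only increase
```

Each property covers whole classes of inputs; the test runner generates hundreds of random cases per property, which is what makes this strictly more expressive than enumerating individual unit tests.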
Over the last years, polymers have gained great attention as substrate materials because of the possibility of producing low-cost sensors in a high-throughput manner or for rapid prototyping, and because of the wide variety of polymeric materials available with different features (such as transparency, flexibility and stretchability). For almost all biosensing applications, the interaction between biomolecules (for example, antibodies, proteins or enzymes) and the employed substrate surface is highly important. In order to realize effective biomolecule immobilization on polymers, different surface activation techniques exist, including chemical and physical methods. Among them, plasma treatment offers an easy, fast and effective activation of the surfaces by micro/nanotexturing and by generating functional groups (including carboxylic acids, amines, esters, aldehydes or hydroxyl groups). Hence, here we present a systematic and comprehensive plasma activation study of various polymeric surfaces, optimizing different parameters including power, time, substrate temperature and gas composition. The highest immobilization efficiency, along with a homogeneous biomolecule distribution, is achieved with a 5-min plasma treatment under a gas composition of 50% oxygen and nitrogen, at a power of 1000 W and a substrate temperature of 80 °C. These results are also confirmed by different surface characterization methods, including SEM, XPS and contact angle measurements.