Repository of University of Nova Gorica

Results 1-10 of 107
2.
Exhaled volatile organic compounds and respiratory disease : recent progress and future outlook
Maria Chiara Magnano, Waqar Ahmed, Ran Wang, Martina Bergant Marušič, Stephen J. Fowler, Iain R. White, 2024, review article

Abstract: The theoretical basis of eVOCs as biomarkers for respiratory disease diagnosis is described, followed by a review of the potential biomarkers that have been proposed as targets from in vitro studies. The utility of these targets is then discussed based on comparison with results from clinical breath studies. The current status of breath research is summarised for various diseases, with emphasis placed on quantitative and targeted studies. Potential for bias highlights several important concepts related to standardization, including practices adopted for compound identification, correction for background inspired VOC levels and computation of mixing ratios. The compiled results underline the need for targeted studies across different analytical platforms to understand how sampling and analytical factors impact eVOC quantification. The impact of environmental VOCs as confounders in breath analysis is discussed alongside the potential that eVOCs have as biomarkers of air pollution exposure and future perspectives on clinical breath sampling are provided.
Keywords: breath analysis, disease diagnosis, exhaled volatile organic compounds, respiratory disease, environmental exposure analysis
Published in RUNG: 06.05.2024; Views: 1276; Downloads: 9
.pdf Full text (1,36 MB)
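The abstract above mentions correction for background inspired VOC levels as one of the standardization practices under discussion. A minimal sketch of the most common convention, the alveolar gradient (exhaled minus inspired concentration), is shown below; the function name and ppb units are illustrative, and the review itself stresses that correction practices vary across studies.

```python
def alveolar_gradient(exhaled_ppb, inspired_ppb):
    """Background correction for a breath VOC: exhaled minus inspired level.

    A positive gradient is usually read as evidence of endogenous production,
    a negative one as net uptake of an environmental contaminant.
    """
    return exhaled_ppb - inspired_ppb
```

For example, a compound measured at 105 ppb in breath against 5 ppb in room air yields a gradient of +100 ppb, pointing to an endogenous source.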

3.
Synthesis of helional by hydrodechlorination reaction in the presence of mono- and bimetallic catalysts supported on alumina
Oreste Piccolo, Iztok Arčon, Das Gangadhar, Giuliana Aquilanti, Andrea Prai, Stefano Paganelli, Manuela Facchin, Valentina Beghetto, 2024, original scientific article

Abstract: Hydrodechlorination reaction of 3-(benzo-1,3-dioxol-5-yl)-3-chloro-2-methylacrylaldehyde in the presence of different low metal content heterogeneous mono- or bimetallic catalysts was tested for the synthesis of the fragrance Helional® (3-[3,4-methylendioxyphenyl]-2-methyl-propionaldehyde). In particular, mono Pd/Al2O3, Rh/Al2O3 or bimetallic Pd-Cu/Al2O3, Rh-Cu/Al2O3 catalysts were tested in different reaction conditions from which it emerged that mono-Rh/Al2O3 was the best performing catalyst, allowing achievement of 100% substrate conversion and 99% selectivity towards Helional® in 24 h at 80 °C, p(H2) 1.0 MPa in the presence of a base. To establish correlations between atomic structure and catalytic activity, catalysts were characterized by Cu, Rh and Pd K-edge XANES, EXAFS analysis. These characterizations allowed verification that the formation of Pd-Cu alloys and the presence of Cu oxide/hydroxide species on the surface of the Al2O3 support are responsible for the very low catalytic efficiency of bimetallic species tested.
Keywords: selective hydrodechlorination, heterogeneous catalysis, XANES analysis, EXAFS analysis
Published in RUNG: 25.04.2024; Views: 1580; Downloads: 9
.pdf Full text (446,63 KB)

4.
Complex network based Fourier analysis for signal processing
Vijayan Vijesh, K. Satheesh Kumar, Mohanachandran Nair Sindhu Swapna, Sankaranarayana Iyer Sankararaman, 2024, published scientific conference contribution

Keywords: fourier analysis, complex network, signal processing
Published in RUNG: 15.04.2024; Views: 1354; Downloads: 3
URL Link to file

5.
AutoSourceID-Classifier : star-galaxy classification using a convolutional neural network with spatial information
F. Stoppa, Saptashwa Bhattacharyya, R. Ruiz de Austri, P. Vreeswijk, S. Caron, Gabrijela Zaharijas, S. Bloemen, G. Principe, D. Malyshev, Veronika Vodeb, 2023, original scientific article

Abstract: Aims: Traditional star-galaxy classification techniques often rely on feature estimation from catalogs, a process susceptible to introducing inaccuracies, thereby potentially jeopardizing the classification’s reliability. Certain galaxies, especially those not manifesting as extended sources, can be misclassified when their shape parameters and flux solely drive the inference. We aim to create a robust and accurate classification network for identifying stars and galaxies directly from astronomical images. Methods: The AutoSourceID-Classifier (ASID-C) algorithm developed for this work uses 32x32 pixel single filter band source cutouts generated by the previously developed AutoSourceID-Light (ASID-L) code. By leveraging convolutional neural networks (CNN) and additional information about the source position within the full-field image, ASID-C aims to accurately classify all stars and galaxies within a survey. Subsequently, we employed a modified Platt scaling calibration for the output of the CNN, ensuring that the derived probabilities were effectively calibrated, delivering precise and reliable results. Results: We show that ASID-C, trained on MeerLICHT telescope images and using the Dark Energy Camera Legacy Survey (DECaLS) morphological classification, is a robust classifier and outperforms similar codes such as SourceExtractor. To facilitate a rigorous comparison, we also trained an eXtreme Gradient Boosting (XGBoost) model on tabular features extracted by SourceExtractor. While this XGBoost model approaches ASID-C in performance metrics, it does not offer the computational efficiency and reduced error propagation inherent in ASID-C’s direct image-based classification approach. ASID-C excels in low signal-to-noise ratio and crowded scenarios, potentially aiding in transient host identification and advancing deep-sky astronomy.
Keywords: astronomical databases, data analysis, statistics, image processing
Published in RUNG: 12.12.2023; Views: 1475; Downloads: 6
.pdf Full text (10,31 MB)
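ASID-C calibrates its CNN outputs with a modified Platt scaling; the abstract does not detail the modification, so the sketch below implements only standard Platt scaling: fitting a sigmoid 1/(1 + exp(a·s + b)) to raw classifier scores by minimising the log loss. All function and variable names here are illustrative, not from the paper's code.

```python
import numpy as np
from scipy.optimize import minimize

def fit_platt(scores, labels):
    """Fit p = 1 / (1 + exp(a*s + b)) to binary labels by minimising log loss."""
    def nll(params):
        a, b = params
        p = 1.0 / (1.0 + np.exp(a * scores + b))
        eps = 1e-12  # guard against log(0)
        return -np.mean(labels * np.log(p + eps)
                        + (1 - labels) * np.log(1 - p + eps))
    res = minimize(nll, x0=[-1.0, 0.0], method="Nelder-Mead")
    return res.x  # fitted (a, b)

def platt_probability(score, a, b):
    """Map a raw score to a calibrated probability."""
    return 1.0 / (1.0 + np.exp(a * score + b))
```

After fitting on held-out scores and labels, `platt_probability` turns the network's raw output into a probability that can be compared across surveys.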

6.
Max-type reliability in uncertain post-disaster networks through the lens of sensitivity and stability analysis
Ahmad Hosseini, 2024, original scientific article

Abstract: The functionality of infrastructures, particularly in densely populated areas, is greatly impacted by natural disasters, resulting in uncertain networks. Thus, it is important for crisis management professionals and computer-based systems for transportation networks (such as expert systems) to utilize trustworthy data and robust computational methodologies when addressing complex decision-making problems concerning the design of transportation networks and optimal routes. This study evaluates the vulnerability of paths in post-disaster transportation networks, with the aim of facilitating rescue operations and ensuring the safe delivery of supplies to affected regions. To investigate the problem of links' tolerances in uncertain networks and the resiliency and reliability of paths, an uncertainty theory-based model that employs minmax optimization with a bottleneck objective function is used. The model addresses the uncertain maximum reliable paths problem, which takes into account uncertain risk variables associated with links. Rather than using conventional methods for calculating the deterministic tolerances of a single element in combinatorial optimization, this study introduces a generalization of stability analysis based on tolerances while the perturbations in a group of links are involved. The analysis defines set tolerances that specify the minimum and maximum values that a designated group of links could simultaneously fluctuate while maintaining the optimality of the max-type reliable paths. The study shows that set tolerances are well-defined and proposes computational methods to calculate or bound such quantities, which were previously unresearched and difficult to measure. The model and methods are demonstrated to be both theoretically and numerically efficient by applying them to four subnetworks from our case study. In conclusion, this study provides a comprehensive approach to addressing uncertainty in reliability problems in networks, with potential applications in various fields.
Keywords: Disaster Management, Network Reliability, Stability Analysis, Transportation, Uncertainty
Published in RUNG: 24.11.2023; Views: 1441; Downloads: 6
URL Link to file
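The study's contribution lies in the uncertainty-theoretic risk variables and the set-tolerance analysis, but its max-type (bottleneck) objective has a simple deterministic core: find the path that maximizes the minimum link reliability. A sketch of that core via a widest-path variant of Dijkstra's algorithm is shown below, under the assumption of known link reliabilities in (0, 1]; the graph encoding and names are illustrative.

```python
import heapq

def max_reliable_path(graph, source, target):
    """Max-min (bottleneck) path value: maximize the minimum link reliability.

    graph: {node: [(neighbour, reliability), ...]} with reliabilities in (0, 1].
    Returns the best achievable bottleneck reliability, or 0.0 if unreachable.
    """
    best = {source: 1.0}
    heap = [(-1.0, source)]  # max-heap via negated bottleneck values
    while heap:
        neg_r, u = heapq.heappop(heap)
        r = -neg_r
        if u == target:
            return r
        if r < best.get(u, 0.0):
            continue  # stale heap entry
        for v, rel in graph.get(u, []):
            cand = min(r, rel)  # bottleneck of the extended path
            if cand > best.get(v, 0.0):
                best[v] = cand
                heapq.heappush(heap, (-cand, v))
    return 0.0
```

On a small post-disaster subnetwork, a route whose weakest link has reliability 0.6 is preferred over an alternative whose weakest link has 0.5, regardless of the other links on either route.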

7.
AutoSourceID-FeatureExtractor : optical image analysis using a two-step mean variance estimation network for feature estimation and uncertainty characterisation
F. Stoppa, R. Ruiz de Austri, P. Vreeswijk, Saptashwa Bhattacharyya, S. Caron, S. Bloemen, Gabrijela Zaharijas, G. Principe, Veronika Vodeb, P. J. Groot, E. Cator, G. Nelemans, 2023, original scientific article

Abstract: Aims: In astronomy, machine learning has been successful in various tasks such as source localisation, classification, anomaly detection, and segmentation. However, feature regression remains an area with room for improvement. We aim to design a network that can accurately estimate sources' features and their uncertainties from single-band image cutouts, given the approximated locations of the sources provided by the previously developed code AutoSourceID-Light (ASID-L) or other external catalogues. This work serves as a proof of concept, showing the potential of machine learning in estimating astronomical features when trained on meticulously crafted synthetic images and subsequently applied to real astronomical data. Methods: The algorithm presented here, AutoSourceID-FeatureExtractor (ASID-FE), uses single-band cutouts of 32x32 pixels around the localised sources to estimate flux, sub-pixel centre coordinates, and their uncertainties. ASID-FE employs a two-step mean variance estimation (TS-MVE) approach to first estimate the features and then their uncertainties without the need for additional information, for example the point spread function (PSF). For this proof of concept, we generated a synthetic dataset comprising only point sources directly derived from real images, ensuring a controlled yet authentic testing environment. Results: We show that ASID-FE, trained on synthetic images derived from the MeerLICHT telescope, can predict more accurate features with respect to similar codes such as SourceExtractor and that the two-step method can estimate well-calibrated uncertainties that are better behaved compared to similar methods that use deep ensembles of simple MVE networks. Finally, we evaluate the model on real images from the MeerLICHT telescope and the Zwicky Transient Facility (ZTF) to test its transfer learning abilities.
Keywords: data analysis, image processing, astronomical databases
Published in RUNG: 08.11.2023; Views: 1360; Downloads: 9
URL Link to file
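ASID-FE's two-step mean variance estimation first fits the mean and then, with the mean held fixed, fits the variance under a Gaussian negative log-likelihood. The sketch below illustrates only that two-step structure, substituting a linear fit for the CNN and a single constant variance for the per-source predictions; everything here is a simplified stand-in, not the paper's architecture.

```python
import numpy as np

def gaussian_nll(y, mu, var):
    """Gaussian negative log-likelihood used in step two, mean held fixed."""
    return float(np.mean(0.5 * np.log(2 * np.pi * var)
                         + (y - mu) ** 2 / (2 * var)))

def two_step_mve(x, y):
    # Step 1: fit the mean model alone (a linear fit stands in for the CNN).
    coeffs = np.polyfit(x, y, 1)
    mu = np.polyval(coeffs, x)
    # Step 2: freeze the mean and fit the variance; for a constant-variance
    # model the NLL minimiser is simply the mean squared residual.
    var = float(np.mean((y - mu) ** 2))
    return mu, var
```

Separating the two steps keeps the variance estimate from distorting the mean fit, which is the motivation the TS-MVE approach gives for training in stages.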

8.
Performance of the Cherenkov Telescope Array in the presence of clouds
Mario Pecimotika, Saptashwa Bhattacharyya, Barbara Marčun, Judit Pérez Romero, Samo Stanič, Veronika Vodeb, Serguei Vorobiov, Gabrijela Zaharijas, Marko Zavrtanik, Danilo Zavrtanik, Miha Živec, 2021, published scientific conference contribution

Abstract: The Cherenkov Telescope Array (CTA) is the future ground-based observatory for gamma-ray astronomy at very high energies. The atmosphere is an integral part of every Cherenkov telescope. Different atmospheric conditions, such as clouds, can reduce the fraction of Cherenkov photons produced in air showers that reach ground-based telescopes, which may affect the performance. Decreased sensitivity of the telescopes may lead to misreconstructed energies and spectra. This study presents the impact of various atmospheric conditions on CTA performance. The atmospheric transmission in a cloudy atmosphere in the wavelength range from 203 nm to 1000 nm was simulated for different cloud bases and different optical depths using the MODerate resolution atmospheric TRANsmission (MODTRAN) code. MODTRAN output files were used as inputs for generic Monte Carlo simulations. The analysis was performed using the MAGIC Analysis and Reconstruction Software (MARS) adapted for CTA. As expected, the effects of clouds are most evident at low energies, near the energy threshold. Even in the presence of dense clouds, high-energy gamma rays may still trigger the telescopes if the first interaction occurs lower in the atmosphere, below the cloud base. A method to analyze very high-energy data obtained in the presence of clouds is presented. The systematic uncertainties of the method are evaluated. These studies help to gain more precise knowledge about the CTA response to cloudy conditions and give insights on how to proceed with data obtained in such conditions. This may prove crucial for alert-based observations and time-critical studies of transient phenomena.
Keywords: Cherenkov Telescope Array, very-high energy gamma rays, MODerate resolution atmospheric TRANsmission code, MAGIC Analysis and Reconstruction Software
Published in RUNG: 18.09.2023; Views: 1363; Downloads: 5
.pdf Full text (980,51 KB)
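The study itself relies on MODTRAN's full per-wavelength radiative transfer, but the basic reason cloud optical depth matters can be illustrated with the Beer-Lambert law, including a simple 1/cos(zenith) slant-path stretch for inclined shower axes. This is a toy illustration only; the function name and the geometry are assumptions, not the paper's method.

```python
import math

def cloud_transmission(optical_depth, zenith_angle_deg=0.0):
    """Toy Beer-Lambert transmission of light through a cloud layer.

    The vertical optical depth is stretched by 1/cos(zenith) for slanted
    paths; MODTRAN instead solves this per wavelength and altitude.
    """
    slant_depth = optical_depth / math.cos(math.radians(zenith_angle_deg))
    return math.exp(-slant_depth)
```

A cloud of optical depth 1 transmits about 37% of vertically incident light, and less at larger zenith angles, which is why dense clouds suppress the low-energy end of the observable spectrum first.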

9.
Performance of a proposed event-type based analysis for the Cherenkov Telescope Array
Tarek Hassan, Saptashwa Bhattacharyya, Barbara Marčun, Judit Pérez Romero, Samo Stanič, Veronika Vodeb, Serguei Vorobiov, Gabrijela Zaharijas, Marko Zavrtanik, Danilo Zavrtanik, Miha Živec, 2021, published scientific conference contribution

Abstract: The Cherenkov Telescope Array (CTA) will be the next-generation observatory in the field of very-high-energy (20 GeV to 300 TeV) gamma-ray astroparticle physics. Classically, data analysis in the field maximizes sensitivity by applying quality cuts on the data acquired. These cuts, optimized using Monte Carlo simulations, select higher quality events from the initial dataset. Subsequent steps of the analysis typically use the surviving events to calculate one set of instrument response functions (IRFs). An alternative approach is the use of event types, as implemented in experiments such as the Fermi-LAT. In this approach, events are divided into sub-samples based on their reconstruction quality, and a set of IRFs is calculated for each sub-sample. The sub-samples are then combined in a joint analysis, treating them as independent observations. This leads to an improvement in performance parameters such as sensitivity, angular and energy resolution. Data loss is reduced since lower quality events are included in the analysis as well, rather than discarded. In this study, machine learning methods will be used to classify events according to their expected angular reconstruction quality. We will report the impact on CTA high-level performance when applying such an event-type classification, compared to the classical procedure.
Keywords: Cherenkov Telescope Array, very-high-energy gamma-rays, event-type based analysis
Published in RUNG: 18.09.2023; Views: 1419; Downloads: 9
.pdf Full text (1,03 MB)
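The statistical core of the event-type approach described above is that independent sub-samples contribute additively to a joint log-likelihood, each with its own instrument response functions. A minimal sketch with Poisson counts per event type is shown below; the use of Poisson statistics and all names here are illustrative assumptions, not details from the contribution.

```python
import math

def poisson_loglike(n_obs, n_pred):
    """Poisson log-likelihood, up to the model-independent log(n_obs!) term."""
    return n_obs * math.log(n_pred) - n_pred

def joint_loglike(obs_by_type, pred_by_type):
    """Event-type joint fit: independent sub-samples, so log-likelihoods add."""
    return sum(poisson_loglike(n, m)
               for n, m in zip(obs_by_type, pred_by_type))
```

Because each sub-sample keeps its own response functions, a model is penalised or rewarded separately per event type, which is where the sensitivity gain over a single merged dataset comes from.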
