958 results for Computer-simulations
Abstract:
A BASIC computer program (REMOVAL) was developed to compute, in a VAX/VMS environment, all the calculations of the removal method for population size estimation (a catch-effort method for closed populations with constant sampling effort). The program follows the maximum likelihood methodology, checks the failure conditions, applies the appropriate formula, and displays the estimates of population size and catchability, with their standard deviations and coefficients of variation, and two goodness-of-fit statistics with their significance levels. Data from removal experiments on the cyprinodontid fish Aphanius iberus in the Alt Empordà wetlands are used to exemplify the use of the program.
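As a hedged illustration of the method this abstract refers to, the following Python sketch implements the classic two-catch removal estimator (Seber & Le Cren), not the REMOVAL program itself; the example catches are invented.

```python
# Minimal sketch of the two-catch removal estimator (Seber & Le Cren, 1967).
# Illustrative only -- the REMOVAL program described above implements the full
# k-sample maximum-likelihood method with goodness-of-fit statistics.

def removal_two_catch(c1: float, c2: float):
    """Estimate population size N and catchability q from two removal catches."""
    if c1 <= c2:
        # Failure condition: the method requires a declining catch series.
        raise ValueError("Estimator undefined: first catch must exceed second.")
    n_hat = c1**2 / (c1 - c2)          # population size estimate
    q_hat = (c1 - c2) / c1             # catchability (capture probability)
    # Asymptotic standard error of N (Seber & Le Cren):
    se_n = (c1 * c2 * (c1 + c2) ** 0.5) / (c1 - c2) ** 2
    return n_hat, q_hat, se_n

# Example: hypothetical catches from two passes with equal effort.
n_hat, q_hat, se_n = removal_two_catch(142, 55)
print(f"N = {n_hat:.1f} +/- {se_n:.1f}, q = {q_hat:.2f}")
```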
Abstract:
In order to develop applications for visual interpretation of medical images, the early detection and evaluation of microcalcifications in digital mammograms is very important, since their presence is often associated with a high incidence of breast cancer. Accurate classification into benign and malignant groups would help improve diagnostic sensitivity as well as reduce the number of unnecessary biopsies. The challenge here is the selection of useful features to distinguish benign from malignant microcalcifications. Our purpose in this work is to analyse a microcalcification evaluation method based on a set of shape-based features extracted from the digitised mammography. The segmentation of the microcalcifications is performed using a fixed-tolerance region growing method to extract boundaries of calcifications with manually selected seed pixels. Taking into account that shapes and sizes of clustered microcalcifications have been associated with a high risk of carcinoma based on different subjective measures, such as whether or not the calcifications are irregular, linear, vermiform, branched, rounded or ring-like, our efforts were addressed to obtaining a feature set related to shape. The identification of the parameters concerning the malignant character of the microcalcifications was performed on a set of 146 mammograms whose real diagnosis was known in advance from biopsies. This allowed identifying the following shape-based parameters as the relevant ones: Number of clusters, Number of holes, Area, Feret elongation, Roughness, and Elongation. Further experiments on a set of 70 new mammograms showed that the performance of the classification scheme is close to the mean performance of three expert radiologists, which supports considering the proposed method for assisting diagnosis and encourages continuing the investigation by adding new features not only related to shape.
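For readers unfamiliar with the segmentation step, here is a minimal Python sketch of fixed-tolerance region growing from a manually selected seed; the 4-connectivity and the way tolerance is applied are assumptions, not the paper's exact procedure.

```python
# Minimal sketch of fixed-tolerance region growing from a manual seed pixel.
# Illustrative only; the paper's exact tolerance and connectivity are assumptions.
from collections import deque
import numpy as np

def region_grow(image: np.ndarray, seed: tuple, tol: float) -> np.ndarray:
    """Return a boolean mask of pixels connected to `seed` whose intensity
    differs from the seed intensity by at most `tol` (4-connectivity)."""
    h, w = image.shape
    seed_val = float(image[seed])
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc] \
               and abs(float(image[nr, nc]) - seed_val) <= tol:
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask
```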
Abstract:
We report a lattice-Boltzmann scheme that accounts for adsorption and desorption in the calculation of mesoscale dynamical properties of tracers in media of arbitrary complexity. Lattice-Boltzmann simulations made it possible to solve numerically the coupled Navier-Stokes equations of fluid dynamics and Nernst-Planck equations of electrokinetics in complex, heterogeneous media. With the moment propagation scheme, it became possible to extract the effective diffusion and dispersion coefficients of tracers, or solutes, of any charge, e.g., in porous media. Nevertheless, the dynamical properties of tracers depend on the tracer-surface affinity, which is not purely electrostatic and also includes a species-specific contribution. In order to capture this important feature, we introduce specific adsorption and desorption processes in a lattice-Boltzmann scheme through a modified moment propagation algorithm, in which tracers may adsorb and desorb from surfaces through kinetic reaction rates. The method is validated against exact results for pure diffusion and diffusion-advection in Poiseuille flows in a simple geometry. We finally illustrate the importance of taking such processes into account in the time-dependent diffusion coefficient in a more complex porous medium.
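As a rough illustration of the surface-exchange step only (not the authors' lattice-Boltzmann moment-propagation scheme), here is a 1D toy model in Python with assumed adsorption and desorption rates ka and kd; total tracer is conserved between the mobile and adsorbed pools.

```python
# Toy 1D illustration of kinetic adsorption/desorption coupled to diffusion.
# This is NOT the lattice-Boltzmann moment-propagation scheme of the paper,
# only a sketch of the surface exchange it introduces: at wall nodes a tracer
# adsorbs with rate ka and desorbs with rate kd.
import numpy as np

nx, nt = 100, 5000
D, dx, dt = 1.0, 1.0, 0.2           # dt <= dx^2 / (2 D) for explicit stability
ka, kd = 0.05, 0.01                 # assumed adsorption/desorption rates

c = np.ones(nx)                     # mobile tracer concentration
ads = np.zeros(2)                   # adsorbed amount at the two walls

for _ in range(nt):
    # explicit diffusion step with zero-flux (wall) boundaries
    lap = np.zeros(nx)
    lap[1:-1] = c[2:] - 2 * c[1:-1] + c[:-2]
    lap[0], lap[-1] = c[1] - c[0], c[-2] - c[-1]
    c += D * dt / dx**2 * lap
    # kinetic exchange with the two walls
    for wall, node in ((0, 0), (1, nx - 1)):
        flux = (ka * c[node] - kd * ads[wall]) * dt
        c[node] -= flux
        ads[wall] += flux

print("total tracer (conserved):", c.sum() + ads.sum())
```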
Abstract:
Network virtualisation is considerably gaining attention as a solution to ossification of the Internet. However, the success of network virtualisation will depend in part on how efficiently the virtual networks utilise substrate network resources. In this paper, we propose a machine learning-based approach to virtual network resource management. We propose to model the substrate network as a decentralised system and introduce a learning algorithm in each substrate node and substrate link, providing self-organisation capabilities. We propose a multiagent learning algorithm that carries out the substrate network resource management in a coordinated and decentralised way. The task of these agents is to use evaluative feedback to learn an optimal policy so as to dynamically allocate network resources to virtual nodes and links. The agents ensure that while the virtual networks have the resources they need at any given time, only the required resources are reserved for this purpose. Simulations show that our dynamic approach significantly improves the virtual network acceptance ratio and the maximum number of accepted virtual network requests at any time, while ensuring that virtual network quality-of-service requirements such as packet drop rate and virtual link delay are not affected.
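A minimal sketch of the kind of evaluative-feedback learning each substrate agent might perform, assuming a tabular Q-learning formulation with hypothetical states, actions, and rewards (the paper's exact algorithm is not reproduced here).

```python
# Minimal sketch of an evaluative-feedback (Q-learning) update that a
# substrate-node agent could use. States, actions, and the reward signal are
# hypothetical stand-ins for the paper's formulation.
import random
from collections import defaultdict

class SubstrateNodeAgent:
    def __init__(self, actions, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = defaultdict(float)   # Q[(state, action)] -> value
        self.actions = actions        # e.g. discrete resource-allocation levels
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def act(self, state):
        if random.random() < self.eps:                  # explore
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])  # exploit

    def learn(self, state, action, reward, next_state):
        best_next = max(self.q[(next_state, a)] for a in self.actions)
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])

# Example: an agent choosing what fraction of a node's CPU to reserve
# for the virtual nodes it hosts.
agent = SubstrateNodeAgent(actions=[0.25, 0.5, 0.75, 1.0])
```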
Abstract:
The purpose of gamma spectrometry and of gamma and X-ray tomography of nuclear fuel is to determine the radionuclide concentration as well as the integrity and deformation of the fuel. The aims of this thesis have been to establish the basics of gamma spectrometry and tomography of nuclear fuel, to describe the operational mechanisms of the corresponding measurement equipment, and to identify problems that relate to these measurement techniques. In gamma spectrometry of nuclear fuel, the gamma-ray flux emitted by unstable isotopes is measured using high-resolution gamma-ray spectroscopy. The production of unstable isotopes correlates with various physical fuel parameters. In gamma emission tomography, the gamma-ray spectrum of irradiated nuclear fuel is recorded for several projections. In X-ray transmission tomography of nuclear fuel, a radiation source emits a beam and the intensity, attenuated by the nuclear fuel, is registered by detectors placed opposite the source. When gamma emission or X-ray transmission measurements are combined with tomographic image reconstruction methods, it is possible to create sectional images of the interior of nuclear fuel. MODHERATO is a computer code that simulates the operation of radioscopic or tomographic devices and is used to predict and optimise the performance of imaging systems. The author has performed MODHERATO simulations related to the X-ray tomography. Gamma spectrometry and gamma and X-ray tomography are promising non-destructive examination methods for understanding fuel behaviour under normal, transient and accident conditions.
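To illustrate the transmission-tomography principle in general (not the MODHERATO code), here is a short Python sketch using scikit-image's radon/iradon: detector intensities follow the Beer-Lambert law, I = I0·exp(−∫μ ds), so log-ratios of intensities give line integrals of the attenuation coefficient μ, which filtered back-projection inverts into a sectional image.

```python
# Illustrative transmission-tomography sketch (not the MODHERATO code).
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

mu = shepp_logan_phantom()                      # stand-in attenuation map
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(mu, theta=theta)               # line integrals of mu
I0 = 1.0e5                                      # assumed source intensity
intensity = I0 * np.exp(-sinogram)              # Beer-Lambert detector readings
projections = -np.log(intensity / I0)           # recover the line integrals
reconstruction = iradon(projections, theta=theta, filter_name="ramp")
```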
Abstract:
The Large Hadron Collider (LHC) is the main particle accelerator at CERN, built to search for elementary particles and to help science investigate our universe. Radiation in the LHC is caused by the circular acceleration of charged particles; therefore, the detectors that trace particles under these severe experimental conditions must be radiation tolerant. Moreover, a further upgrade of the luminosity (up to 10³⁵ cm⁻²s⁻¹) requires development of the particle detectors' structure. This work presents a new type of 3D stripixel detector with substantial structural improvements. This new type of radiation-hard detector has a three-dimensional (3D) array of p+ and n+ electrodes that penetrate into the detector bulk; the electrons and holes are then collected at oppositely biased electrodes. The proposed 3D stripixel detector demonstrates a full depletion voltage lower than that of planar detectors. A low depletion voltage is one of the main advantages, because only the depleted part of the device is active area. Because of the small spacing between electrodes, charge collection distances are shorter, which results in a fast detector response. This work also briefly discusses dual-column detectors, which contain both n+ and p+ columnar electrodes in their structure; dual-column detectors show a better electric field distribution than single-sided radiation detectors, and the dead space (in other words, the low-electric-field region) is significantly suppressed. Simulations were carried out using the Atlas device simulation software. The simulation results presented in this work include the electric field distribution under different bias voltages.
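As a back-of-envelope illustration of the depletion-voltage advantage (the thesis's Atlas simulations are far more detailed), the planar full-depletion formula V_fd = e·N_eff·L²/(2ε) can be evaluated with the wafer thickness as L for a planar device and, as a rough analogy, the electrode spacing as L for a 3D device; the doping and geometries below are assumed example values.

```python
# Back-of-envelope illustration: V_fd scales with the square of the drift
# distance, so the small electrode spacing of a 3D detector lowers the full
# depletion voltage. N_eff and the geometries are assumed, not the thesis's.
e = 1.602e-19             # C, elementary charge
eps = 11.7 * 8.854e-12    # F/m, permittivity of silicon
n_eff = 1e12 * 1e6        # effective doping: 1e12 cm^-3 expressed in m^-3

def v_full_depletion(length_m: float) -> float:
    return e * n_eff * length_m**2 / (2 * eps)

print(f"planar, 300 um thick : {v_full_depletion(300e-6):.1f} V")
print(f"3D, 50 um spacing    : {v_full_depletion(50e-6):.1f} V")
```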
Abstract:
Aging is associated with common conditions, including cancer, diabetes, cardiovascular disease, and Alzheimer's disease. The type of multi-targeted pharmacological approach necessary to address a complex multifaceted disease such as aging might take advantage of pleiotropic natural polyphenols affecting a wide variety of biological processes. We have recently postulated that the secoiridoids oleuropein aglycone (OA) and decarboxymethyl oleuropein aglycone (DOA), two complex polyphenols present in health-promoting extra virgin olive oil (EVOO), might constitute a new family of plant-produced gerosuppressant agents. This paper describes an analysis of the biological activity spectra (BAS) of OA and DOA using PASS (Prediction of Activity Spectra for Substances) software. PASS can predict thousands of biological activities, as the BAS of a compound is an intrinsic property that is largely dependent on the compound's structure and reflects pharmacological effects, physiological and biochemical mechanisms of action, and specific toxicities. Using Pharmaexpert, a tool that analyzes the PASS-predicted BAS of substances based on thousands of "mechanism-effect" and "effect-mechanism" relationships, we illuminate hypothesis-generating pharmacological effects, mechanisms of action, and targets that might underlie the anti-aging/anti-cancer activities of the gerosuppressant EVOO oleuropeins.
Abstract:
This master's thesis addresses the development of production planning and control in an industrial company's small-batch production. The subject of the work is the Tuulivoimageneraattorit (Wind Power Generators) profit unit of ABB Oy, which manufactures standard products on a customer-order-driven basis. The thesis first presents the theory of production and production control. In addition to basics such as definitions, objectives, and tasks, the production control process and the information technology of production control are reviewed. The empirical part following the theory presents the means developed in this work for improving production control. The study was carried out through theoretical and empirical research. The theoretical research included a review of Finnish and international literature. The empirical research was conducted as independent problem-solving work, which included analysing the development targets, specifying more detailed development needs, and development work through experiments. The main objective of the study was to determine how developing production control can improve the productivity and profitability of the profit unit in question. Six sub-objectives were derived from the main objective: improving delivery reliability, raising the capacity utilisation rate, developing capacity planning, shortening throughput times, specifying requirements for a new ERP system, and defining the production control process. For the first four sub-objectives, software applications were built that enable the planning and control of those sub-objectives. For the applications, each product was assigned, for example, its work-phase chains with throughput times, load groups, load-group capacities, product load factors, and critical tools. The work showed that information technology greatly assists production control. Increased transparency, improved information flow, simulation possibilities, and graphical presentation ease the preparation of various plans and thus improve the quality of decision-making. Disciplined updating of production master and transaction data forms the basis for exploiting information technology; for this reason, information systems should be built as simple as possible.
Abstract:
Major challenges must be tackled for brain-computer interfaces to mature into an established communications medium for VR applications; these challenges range from basic neuroscience studies to developing optimal peripherals and mental gamepads and more efficient brain-signal processing techniques.
Abstract:
A brain-computer interface (BCI) is a new communication channel between the human brain and a computer. Applications of BCI systems include the restoration of movement, communication, and environmental control. In this study, experiments were conducted in which a BCI system was used to control or navigate in virtual environments (VE) by thought alone. To date, BCI experiments for navigation in VR have been conducted with both synchronous and asynchronous BCI systems. A synchronous BCI analyzes the EEG patterns in a predefined time window and offers 2 to 3 degrees of freedom.
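As a hedged sketch of what a synchronous BCI does with each cue-aligned window, the Python fragment below derives a band-power feature and a crude left/right decision; the sampling rate, channels (C3/C4), band, and decision rule are assumptions for illustration, not the study's pipeline.

```python
# Minimal sketch of the synchronous-BCI idea: after each cue, analyse the EEG
# in a fixed time window and derive a control signal.
import numpy as np
from scipy.signal import welch

FS = 250                      # sampling rate (Hz), assumed

def mu_band_power(window: np.ndarray) -> float:
    """Band power in the mu rhythm (8-12 Hz) of one cue-aligned EEG window."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    band = (freqs >= 8) & (freqs <= 12)
    return float(np.trapz(psd[band], freqs[band]))

def classify(window_c3: np.ndarray, window_c4: np.ndarray) -> str:
    """Crude left/right motor-imagery decision from lateralised mu power
    (event-related desynchronisation lowers mu power contralaterally)."""
    return "left" if mu_band_power(window_c4) < mu_band_power(window_c3) else "right"
```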
Abstract:
The use of two-dimensional spectral analysis applied to terrain heights in order to determine characteristic terrain spatial scales, and its subsequent use for the objective definition of an adequate grid size required to resolve terrain forcing, are presented in this paper. In order to illustrate the influence of grid size, atmospheric flow in a complex terrain area of the Spanish east coast is simulated by the Regional Atmospheric Modeling System (RAMS) mesoscale numerical model using different horizontal grid resolutions. In this area, a grid size of 2 km is required to account for 95% of terrain variance. Comparison among results of the different simulations shows that, although the main wind behavior does not change dramatically, some small-scale features appear when using a resolution of 2 km or finer. Horizontal flow pattern differences are significant both in the nighttime, when terrain forcing is more relevant, and in the daytime, when thermal forcing is dominant. Vertical structures also are investigated, and results show that vertical advection is influenced highly by the horizontal grid size during the daytime period. The turbulent kinetic energy and potential temperature vertical cross sections show substantial differences in the structure of the planetary boundary layer for each model configuration.
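A minimal numpy sketch of the grid-size criterion described above, under the simplifying assumption that the Nyquist wavelength (two grid lengths) counts as resolved; real mesoscale models need several grid lengths per wavelength, so this is a lower bound on resolution, not the paper's exact procedure.

```python
# Sketch of the idea: spectrally decompose terrain heights and find the
# smallest wavelength needed to retain 95% of terrain variance.
import numpy as np

def grid_size_for_variance(z: np.ndarray, dx_km: float, frac: float = 0.95) -> float:
    """z: 2D terrain height array on a dx_km-spaced grid; returns grid size (km)."""
    zp = z - z.mean()                                   # remove mean height
    power = np.abs(np.fft.fft2(zp)) ** 2                # 2D power spectrum
    kx = np.fft.fftfreq(z.shape[0], d=dx_km)
    ky = np.fft.fftfreq(z.shape[1], d=dx_km)
    k = np.sqrt(kx[:, None] ** 2 + ky[None, :] ** 2)    # radial wavenumber (1/km)
    k_flat, p_flat = k.ravel(), power.ravel()
    order = np.argsort(k_flat)                          # large scales first
    cum = np.cumsum(p_flat[order]) / p_flat.sum()       # cumulative variance
    k_cut = k_flat[order][np.searchsorted(cum, frac)]   # wavenumber at 95%
    return (1.0 / k_cut) / 2.0                          # Nyquist grid spacing (km)
```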
Abstract:
The speed of traveling fronts for a two-dimensional model of a delayed reaction-dispersal process is derived analytically and from molecular dynamics simulations. We show that the one-dimensional (1D) and two-dimensional (2D) versions of a given kernel do not always yield the same speed. It is also shown that the speeds of time-delayed fronts may be higher than those predicted by the corresponding non-delayed models. This result is demonstrated for systems with peaked dispersal kernels, which lead to ballistic transport.
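As a toy stand-in (not the paper's molecular-dynamics code), the Python sketch below runs a discrete-generation dispersal-growth model in 1D, where the fixed generation time plays the role of the delay; the front speed is read off as the slope of front position versus generation number. Kernel, growth rate, and threshold are assumed.

```python
# Toy 1D dispersal-growth front with discrete (delayed) generations.
import numpy as np

L, dx, gens, r = 400.0, 0.1, 60, 1.0
x = np.arange(-L / 2, L / 2, dx)
n = (np.abs(x) < 1.0).astype(float)            # initial localised population

sigma = 1.0                                    # dispersal kernel width (assumed)
xk = np.arange(-5 * sigma, 5 * sigma + dx, dx)
kernel = np.exp(-xk**2 / (2 * sigma**2))
kernel /= kernel.sum()                         # normalised Gaussian kernel

fronts = []
for _ in range(gens):
    n = np.minimum(n * (1.0 + r), 1.0)         # growth with saturation
    n = np.convolve(n, kernel, mode="same")    # dispersal
    fronts.append(x[np.nonzero(n > 0.01)[0][-1]])  # rightmost front position

speed = np.polyfit(np.arange(gens), fronts, 1)[0]
print(f"front speed ~ {speed:.3f} per generation")
```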
Abstract:
The aim of this paper was to present a simple and fast way of simulating Nuclear Magnetic Resonance (NMR) signals using the Bloch equations. These phenomenological equations describe the classical behavior of macroscopic magnetization and are easily simulated using rotation matrices. Many NMR pulse sequences can be simulated with this formalism, allowing a quantitative description of the influence of many experimental parameters. Finally, the paper presents simulations of conventional sequences such as Single Pulse, Inversion Recovery, Spin Echo and CPMG.
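A minimal Python sketch of the rotation-matrix formalism the abstract describes, with assumed relaxation times: hard RF pulses rotate the magnetisation vector, free evolution applies T1/T2 relaxation, and the inversion-recovery example reproduces the textbook signal |1 − 2·exp(−TI/T1)|.

```python
# Sketch of Bloch-equation simulation with rotation matrices: hard pulses
# rotate M, free evolution relaxes it. Relaxation times are assumed values.
import numpy as np

M0 = np.array([0.0, 0.0, 1.0])     # equilibrium magnetisation

def pulse_x(theta_deg: float) -> np.ndarray:
    """Rotation matrix for a hard pulse of flip angle theta about x."""
    t = np.deg2rad(theta_deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(t), np.sin(t)],
                     [0, -np.sin(t), np.cos(t)]])

def relax(M: np.ndarray, t: float, T1: float, T2: float) -> np.ndarray:
    """Free evolution for time t (no off-resonance): Mxy decays with T2,
    Mz recovers toward equilibrium with T1."""
    e1, e2 = np.exp(-t / T1), np.exp(-t / T2)
    return np.array([M[0] * e2, M[1] * e2, 1.0 + (M[2] - 1.0) * e1])

# Inversion recovery: 180x -- TI -- 90x; signal = |Mxy| right after the 90.
T1, T2 = 1.0, 0.1                  # seconds, assumed
for TI in (0.1, 0.5, 1.0, 2.0):
    M = pulse_x(90.0) @ relax(pulse_x(180.0) @ M0, TI, T1, T2)
    print(f"TI = {TI:.1f} s -> signal {abs(M[1]):.3f}")
```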