967 results for CHANDRASEKHAR MASS MODELS


Relevance:

30.00%

Publisher:

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes a greater dependency on the potential pollution sources, rather than the traditional approach in which assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the same distribution as historically occurring events or an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as a risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch processing file. GIS software was employed to produce the input files and to present the results. The functionality of the method, as well as its sensitivity to model grid size, contaminant loading rates, length of stress periods, and the historical frequency of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of applying the method are presented as tables, graphs and spatial maps. Varying the model grid size indicated no significant effect on the simulated groundwater head, and the simulated frequency of daily occurrence of pollution incidents is likewise independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with grid size; the contaminant plume also advances faster on coarse grids than on finer grids. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases nonlinearly with the frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a major asset of the method and a clear advantage over contemporary risk and vulnerability methods.
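
A minimal sketch of the stochastic source-term step described above: a synthetic pollution event is triggered wherever a random draw falls below the cell's probability of occurrence. All names and parameter values here are illustrative assumptions, not taken from the thesis.

import numpy as np

rng = np.random.default_rng(seed=42)

n_rows, n_cols = 50, 50                    # model grid (hypothetical size)
p_event = np.full((n_rows, n_cols), 0.01)  # daily probability of a pollution event per cell (assumed)
event_mass = 250.0                         # contaminant mass per event, kg (assumed)

def daily_source_terms(p_event, event_mass):
    """Return a grid of synthetic contaminant source terms for one day."""
    triggered = rng.random(p_event.shape) < p_event
    return np.where(triggered, event_mass, 0.0)

# Repeated realisations, as in the Monte Carlo procedure: here we simply
# accumulate the generated source mass per cell over many simulated days.
total_mass = sum(daily_source_terms(p_event, event_mass) for _ in range(1000))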

Relevance:

30.00%

Publisher:

Abstract:

A 10 cm diameter four-stage Scheibel column with dispersed-phase-wetted packing sections was constructed to study hydrodynamics and mass transfer using the system toluene-acetone-water. The literature on this extractor was examined, and the important phenomena, such as droplet break-up and coalescence, mass transfer and backmixing, were reviewed. A critical analysis of the backmixing (axial mixing) models and the corresponding parameter estimation techniques was carried out, and an optimization technique based on Marquardt's algorithm was implemented. A single-phase sampling technique was developed to estimate the acetone concentration profile in both phases along the column. Column flooding characteristics were investigated under various operating conditions, and it was found that, when the impellers were located at about DI/5 cm from the upper surface of the pads, the limiting flow rates increased with impeller speed. This unusual behaviour was explained in terms of the pumping effect created by the turbine impellers. Correlations were developed to predict Sauter mean drop diameters. A five-cell model with backflow was used to estimate the column performance (stage efficiency) and phase non-ideality (backflow parameters). Overall mass transfer coefficients were computed using this model and compared with those calculated from correlations based on the single-drop mechanism.
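
A minimal sketch of Marquardt-style parameter estimation as described above: fitting a toy stand-in for the backflow model to a measured concentration profile with the Levenberg-Marquardt routine in SciPy. The profile data, the model form and the parameter names are invented for illustration.

import numpy as np
from scipy.optimize import least_squares

# Hypothetical acetone concentration profile sampled along the column.
z = np.linspace(0.0, 1.0, 8)   # normalised column height
c_meas = np.array([0.95, 0.78, 0.61, 0.47, 0.36, 0.27, 0.21, 0.16])

def model_profile(params, z):
    """Toy stand-in for the five-cell backflow model: an exponential profile
    whose two parameters play the roles of stage efficiency and backflow."""
    efficiency, backflow = params
    return backflow + (1.0 - backflow) * np.exp(-efficiency * z)

def residuals(params):
    return model_profile(params, z) - c_meas

# method='lm' selects MINPACK's Levenberg-Marquardt implementation.
fit = least_squares(residuals, [2.0, 0.1], method='lm')
print(fit.x)  # estimated (efficiency, backflow)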

Relevance:

30.00%

Publisher:

Abstract:

Large-scale disasters occur constantly around the world, and in many cases the evacuation of regions of a city is needed. Operational Research/Management Science (OR/MS) has been widely used in emergency planning for over five decades. Warning dissemination, evacuee transportation and shelter management are three 'Evacuation Support Functions' (ESFs) generic to many hazards. This thesis adopts a case study approach to illustrate the importance of an integrated approach to evacuation planning, and particularly the role of OR/MS models. In the warning dissemination phase, uncertainty in households' behaviour as 'warning informants' was investigated, along with uncertainties in the warning system. An agent-based model (ABM) was developed for ESF-1, with households as agents and 'warning informant' behaviour as the agent behaviour. The model was used to study warning dissemination effectiveness under various conditions of the official channel. In the transportation phase, uncertainties in household behaviour such as departure time (a function of ESF-1), means of transport and destination were investigated. Households could evacuate on foot, by car or by evacuation buses. An ABM was developed to study evacuation performance (measured in evacuation travel time). In this thesis, a holistic approach for planning public evacuation shelters, called the 'Shelter Information Management System' (SIMS), has also been developed. A generic framework was developed to allocate available shelter capacity to shelter demand by considering evacuation travel time; this was formulated using integer programming. In the sheltering phase, the uncertainty in household shelter choice (nearest, allocated or convenient) was studied for its impact on allocation policies using sensitivity analyses. Using analyses from the models and a detailed examination of household states from 'warning to safety', it was found that the three ESFs, though sequential in time, have many interdependencies from the perspective of evacuation planning. This thesis illustrates an OR/MS-based integrated approach that includes, and goes beyond, single-ESF preparedness. The developed approach will help in understanding the inter-linkages of the three evacuation phases and in preparing multi-agency-based evacuation plans.
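
The capacity-constrained shelter allocation described above can be illustrated as a small integer program. This is a minimal sketch under invented data (zones, shelters, demands, capacities and travel times), not the formulation used in the thesis; it uses the PuLP modelling library.

import pulp

# Hypothetical data: evacuation zones, shelters, travel times (minutes).
zones = {"Z1": 120, "Z2": 80, "Z3": 60}   # zone -> evacuee demand
shelters = {"S1": 150, "S2": 140}         # shelter -> capacity
travel = {("Z1", "S1"): 10, ("Z1", "S2"): 25,
          ("Z2", "S1"): 20, ("Z2", "S2"): 15,
          ("Z3", "S1"): 30, ("Z3", "S2"): 12}

prob = pulp.LpProblem("shelter_allocation", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", travel.keys(), lowBound=0, cat="Integer")

# Objective: minimise total evacuee travel time.
prob += pulp.lpSum(travel[k] * x[k] for k in travel)

# Every evacuee in a zone must be assigned to some shelter.
for z, demand in zones.items():
    prob += pulp.lpSum(x[(z, s)] for s in shelters) == demand

# Shelter capacities must not be exceeded.
for s, cap in shelters.items():
    prob += pulp.lpSum(x[(z, s)] for z in zones) <= cap

prob.solve()
for k in travel:
    print(k, int(x[k].value()))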

Relevance:

30.00%

Publisher:

Abstract:

As a discipline, supply chain management (SCM) has traditionally been concerned primarily with the procurement, processing, movement and sale of physical goods. However, an important class of products has emerged - digital products - which cannot be described as physical, as they do not obey commonly understood physical laws. They possess neither mass nor volume, and they require no energy in their manufacture or distribution. With the Internet, they can be distributed at speeds unimaginable in the physical world, and every copy produced is a 100% perfect duplicate of the original version. Furthermore, the ease with which digital products can be replicated has few analogues in the physical world. This paper assesses the effect of non-physicality on one such product – software – in relation to the practice of SCM. It explores the challenges that arise when managing the software supply chain and how practitioners are addressing these challenges. Using a two-pronged exploratory approach that examines the literature on software management and draws on direct interviews with software distribution practitioners, a number of key challenges associated with software supply chains are uncovered, along with responses to these challenges. The paper proposes a new model for software supply chains that takes into account the non-physicality of the product being delivered. Central to this model are the replacement of physical flows with flows of intellectual property, the growing importance of innovation over duplication, and the increased centrality of the customer in the entire process. Hybrid physical/digital supply chains are discussed, and a framework for practitioners concerned with software supply chains is presented.

Relevance:

30.00%

Publisher:

Abstract:

The E01-011 experiment at Jefferson Laboratory (JLab) studied light-to-medium-mass Λ hypernuclei via the AZ + e → [special characters omitted] + e' + K+ electroproduction reaction. Precise measurement of hypernuclear ground-state masses and excitation energies provides information about the nature of hyperon-nucleon interactions. Until recently, hypernuclei were studied at accelerator facilities with intense π+ and K- meson beams. The poor quality of these beams limited the resolution of the hypernuclear excitation energy spectra to about 1.5 MeV (FWHM), which is not sufficient to resolve the rich structure observed in the excitation spectra. By using a high-quality electron beam and employing a new high-resolution spectrometer system, this study aims to improve the resolution to a few hundred keV, with an absolute precision of about 100 keV for excitation energies. In this work the high-resolution excitation spectra of the [special characters omitted], and [special characters omitted] hypernuclei are presented. In an attempt to emphasize the presence of the core-excited states, we introduced a novel likelihood approach to particle identification (PID) as an alternative to the commonly used standard hard-cut PID. The new method produced missing-mass spectra almost identical to those obtained with the standard approach. An energy resolution of approximately 400-500 keV (FWHM) has been achieved, an unprecedented value in hypernuclear reaction spectroscopy. For [special characters omitted] the core-excited configuration has been clearly observed with significant statistics. The embedded Λ hyperon increases the excitation energies of the 11B nuclear core by 0.5-1 MeV. The [special characters omitted] spectrum has been observed with significant statistics for the first time. The ground state is bound more deeply by roughly 400 keV than currently predicted by theory. An indication of the core-excited doublet, which is unbound in the core itself, is observed. The measurement of [special characters omitted] provides the first study of a d-shell hypernucleus with sub-MeV resolution. Discrepancies of up to 2 MeV between measured and theoretically predicted binding energies are found, and similar disagreement exists in comparison with the [special characters omitted] mirror hypernucleus. The core-excited structure observed between the major s-, p- and d-shell Λ orbits is also not consistent with the available theoretical calculations. In conclusion, the discrepancies found in this study will provide valuable input for the further development of theoretical models.
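
A minimal sketch of the likelihood PID idea described above, contrasted with a hard-cut selection: instead of requiring each variable to fall inside a fixed window, the probability densities of all variables are combined under each particle hypothesis. The detector variables, response distributions and threshold are invented for illustration and are not the experiment's actual PID inputs.

import numpy as np
from scipy.stats import norm

def hard_cut_is_kaon(beta, cherenkov):
    """Hard-cut PID: keep a track only if every variable is inside a window."""
    return (0.90 < beta < 0.95) and (cherenkov < 2.0)

# Assumed detector response PDFs for the kaon and pion hypotheses.
pdfs = {
    "kaon": {"beta": norm(0.925, 0.015), "cherenkov": norm(1.0, 0.5)},
    "pion": {"beta": norm(0.985, 0.015), "cherenkov": norm(4.0, 1.0)},
}

def likelihood_is_kaon(beta, cherenkov, threshold=0.9):
    """Likelihood PID: accept if the kaon hypothesis carries most of the likelihood."""
    like = {h: p["beta"].pdf(beta) * p["cherenkov"].pdf(cherenkov)
            for h, p in pdfs.items()}
    return like["kaon"] / (like["kaon"] + like["pion"]) > threshold

print(hard_cut_is_kaon(0.92, 1.2), likelihood_is_kaon(0.92, 1.2))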

Relevance:

30.00%

Publisher:

Abstract:

The northern Antarctic Peninsula is one of the fastest-changing regions on Earth. The disintegration of the Larsen-A Ice Shelf in 1995 caused tributary glaciers to adjust by speeding up, lowering their surfaces, and increasing overall ice-mass discharge. In this study, we investigate the temporal variation of these changes at the Dinsmoor-Bombardier-Edgeworth glacier system by analyzing dense time series from various spaceborne and airborne Earth observation missions, covering pre-collapse ice shelf conditions and subsequent adjustments through 2014. Our results show a response of the glacier system some months after the breakup, with maximum surface velocities at the glacier front of up to 8.8 m/d in 1999 and a subsequent decrease to ~1.5 m/d in 2014. Using a dense time series of interferometrically derived TanDEM-X digital elevation models and photogrammetric data, an exponential function was fitted to the decrease in surface elevation. Elevation changes in areas below 1000 m a.s.l. amounted to at least 130±15 m between 1995 and 2014, with change rates of ~3.15 m/a between 2003 and 2008. Current change rates (2010-2014) are in the range of 1.7 m/a. Mass imbalances were computed with different scenarios of boundary conditions; the most plausible results amount to -40.7±3.9 Gt. The contribution to sea level rise was estimated to be 18.8±1.8 Gt, corresponding to a 0.052±0.005 mm sea level equivalent, for the period 1995-2014. Our analysis and scenario considerations revealed that major uncertainties still exist due to insufficiently accurate ice-thickness information; the second-largest uncertainty in the computations was the glacier surface mass balance, which is still poorly known. Our time series analysis facilitates improved comparison with GRACE data and provides input for modeling glacio-isostatic uplift in this region. The study contributes to a better understanding of how glacier systems adjust to ice shelf disintegration.
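
A minimal sketch of fitting an exponential function to a surface-lowering time series, in the spirit of the elevation-change analysis above. The data points and parameter values are invented for illustration, not the TanDEM-X measurements.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative surface lowering (m) at one location.
t = np.array([0, 2, 4, 8, 12, 16, 19])               # years since the 1995 collapse
dh = np.array([0, -35, -62, -95, -112, -122, -128])  # lowering, m (invented)

def exp_decay(t, dh_max, tau):
    """Surface lowering that approaches dh_max with e-folding timescale tau."""
    return dh_max * (1.0 - np.exp(-t / tau))

params, cov = curve_fit(exp_decay, t, dh, p0=[-130.0, 6.0])
dh_max, tau = params
rate_now = exp_decay(19.0, *params) - exp_decay(18.0, *params)  # m/a near the end
print(dh_max, tau, rate_now)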

Relevance:

30.00%

Publisher:

Abstract:

Oxygen and carbon isotope measurements were carried out on tests of the planktic foraminifer N. pachyderma (sin.) from eight sediment cores taken from the eastern Arctic Ocean, the Fram Strait, and the Iceland Sea, in order to reconstruct Arctic Ocean and Norwegian-Greenland Sea circulation patterns and ice covers during the last 130,000 years. In addition, the influence of ice, temperature and salinity effects on the isotopic signal was quantified. Isotope measurements on foraminifers from sediment surface samples were used to elucidate the ecology of N. pachyderma (sin.). Changes in the oxygen and carbon isotope composition of N. pachyderma (sin.) from sediment surface samples document the horizontal and vertical changes of water mass boundaries controlled by water temperature and salinity, because N. pachyderma (sin.) shows drastic changes in depth habitat depending on the water mass properties. It could be shown that a regionally and spatially apparent increase of the ice effect occurred in the investigated areas, especially during Termination I through the direct advection of meltwater from nearby continents, and during the termination and in interglacials through the supply of isotopically light river water. A northward-proceeding overprint of the 'global' ice effect, increasing from the Norwegian-Greenland Sea to the Arctic Ocean, could not be demonstrated. By means of a model, the influence of temperature and salinity on the global ice volume signal during the last 130,000 years was quantified. In combination with the results of this study, the model formed the basis for a reconstruction of the paleoceanographic development of the Arctic Ocean and the Norwegian-Greenland Sea during this time interval. The conception of a relatively thick and permanent sea ice cover in the Nordic Seas during glacial times should be replaced by a model of a seasonally and regionally highly variable ice cover. Only during isotope stage 5e may local deep-water formation have occurred in the Fram Strait.

Relevance:

30.00%

Publisher:

Abstract:

A new modality for preventing HIV transmission is emerging in the form of topical microbicides. Some clinical trials have shown promising results for these methods of protection, while other trials have failed to show efficacy. Given the relatively novel nature of microbicide drug transport, a rigorous, deterministic analysis of that transport can help improve the design of microbicide vehicles and aid the interpretation of results from clinical trials. Such analysis supports microbicide product design by helping to understand and organize the determinants of drug transport and the potential efficacies of candidate microbicide products.

Microbicide drug transport is modeled as a diffusion process with convection and reaction effects in appropriate compartments. This is applied here to vaginal gels and rings and a rectal enema, all delivering the microbicide drug Tenofovir. Although the focus here is on Tenofovir, the methods established in this dissertation can readily be adapted to other drugs, given knowledge of their physical and chemical properties, such as the diffusion coefficient, partition coefficient, and reaction kinetics. Other dosage forms such as tablets and fiber meshes can also be modeled using the perspective and methods developed here.

The analyses here include convective details of intravaginal flows by both ambient fluid and spreading gels with different rheological properties and applied volumes. These are input to the overall conservation equations for drug mass transport in different compartments. The results are Tenofovir concentration distributions in time and space for a variety of microbicide products and conditions. The Tenofovir concentrations in the vaginal and rectal mucosal stroma are converted, via a coupled reaction equation, to concentrations of Tenofovir diphosphate, which is the active form of the drug that functions as a reverse transcriptase inhibitor against HIV. Key model outputs are related to concentrations measured in experimental pharmacokinetic (PK) studies, e.g. concentrations in biopsies and blood. A new measure of microbicide prophylactic functionality, the Percent Protected, is calculated. This is the time dependent volume of the entire stroma (and thus fraction of host cells therein) in which Tenofovir diphosphate concentrations equal or exceed a target prophylactic value, e.g. an EC50.

Results show the prophylactic potential of the studied microbicide vehicles against HIV infection, and key design parameters for each are addressed in applying the models. For a vaginal gel, fast spreading at small volume is more effective than slower spreading at high volume. Vaginal rings are shown to be most effective if inserted and retained as close to the fornix as possible. Because of the long half-life of Tenofovir diphosphate, temporary removal of the vaginal ring (after achieving steady state) for up to 24 h does not appreciably diminish the Percent Protected; however, full steady state (for the entire stromal volume) is not achieved until several days after ring insertion. Delivery of Tenofovir to the rectal mucosa by an enema is dominated by the surface area of coated mucosa and by whether the interiors of rectal crypts are filled with the enema fluid. For the enema, 100% Percent Protected is achieved much more rapidly than for vaginal products, primarily because of the much thinner epithelial layer of the rectal mucosa; for example, 100% Percent Protected can be achieved with a one-minute enema application and a 15-minute wait time.

Results of these models agree well with experimental pharmacokinetic data from animals and clinical trials. They also improve upon traditional, empirical PK modeling, and this is illustrated here. Our deterministic approach can inform the design of sampling in clinical trials by indicating time periods during which significant changes in drug concentrations occur in different compartments. More fundamentally, the work here helps delineate the determinants of microbicide drug delivery. This information can be key to improved, rational design of microbicide products and their dosage regimens.
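
A minimal sketch of how such a compartmental transport model and the Percent Protected measure fit together: a one-dimensional diffusion equation with first-order loss, solved by explicit finite differences across a stromal layer, followed by the fraction of the layer at or above a target concentration. The geometry, coefficients and target value are assumed for illustration, not taken from the dissertation.

import numpy as np

# 1D diffusion with first-order loss across a mucosal stroma layer.
# All parameter values are illustrative assumptions.
L = 0.03         # stromal thickness, cm
D = 5e-7         # diffusion coefficient, cm^2/s
k = 1e-4         # first-order loss rate, 1/s
c_surface = 1.0  # normalised drug concentration at the epithelial boundary
target = 0.1     # normalised prophylactic target (stand-in for an EC50)

n = 60
dx = L / n
dt = 0.4 * dx**2 / D   # stable explicit time step
c = np.zeros(n)

def percent_protected(c, target):
    """Fraction of the stromal volume at or above the target concentration."""
    return 100.0 * np.mean(c >= target)

t, t_end = 0.0, 6 * 3600.0   # simulate six hours
while t < t_end:
    c[0] = c_surface   # fixed source at the luminal boundary
    lap = np.zeros_like(c)
    lap[1:-1] = (c[2:] - 2*c[1:-1] + c[:-2]) / dx**2
    c[1:-1] += dt * (D * lap[1:-1] - k * c[1:-1])
    c[-1] += dt * (D * (c[-2] - c[-1]) / dx**2 - k * c[-1])  # approximate no-flux boundary
    t += dt

print(percent_protected(c, target))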

Relevance:

30.00%

Publisher:

Abstract:

The conventional mechanism of fermion mass generation in the Standard Model involves Spontaneous Symmetry Breaking (SSB). In this thesis, we study an alternative mechanism for the generation of fermion masses that does not require SSB, in the context of lattice field theories. Being inherently strongly coupled, this mechanism requires a non-perturbative approach such as the lattice approach.

In order to explore this mechanism, we study a simple lattice model with a four-fermion interaction that has massless fermions at weak couplings and massive fermions at strong couplings, but without any spontaneous symmetry breaking. Prior work on this type of mass-generation mechanism in 4D was done long ago using either mean-field theory or Monte Carlo calculations on small lattices. In this thesis, we have developed a new computational approach that enables us to perform large-scale quantum Monte Carlo calculations to study the phase structure of this theory. In 4D, our results confirm prior results but differ in some quantitative details of the phase diagram. In 3D, by contrast, we discover a new second-order critical point using calculations on lattices of up to $60^3$ sites; such large-scale calculations are unprecedented. The presence of the critical point implies the existence of an alternative mechanism of fermion mass generation without any SSB, which could be of interest in continuum quantum field theory.

Relevance:

30.00%

Publisher:

Abstract:

Secondary organic aerosol (SOA) accounts for a dominant fraction of the submicron atmospheric particle mass, but knowledge of the formation, composition and climate effects of SOA is incomplete and limits our understanding of overall aerosol effects in the atmosphere. Organic oligomers were discovered as dominant components of SOA over a decade ago in laboratory experiments and have since been proposed to play a dominant role in many aerosol processes. However, it has remained unclear whether oligomers are relevant under ambient atmospheric conditions, because they are often not clearly observed in field samples. Here we resolve this long-standing discrepancy by showing that elevated SOA mass is one of the key drivers of oligomer formation in the ambient atmosphere and in laboratory experiments. We show for the first time that a specific class of organic compounds in aerosols, oligomers, is strongly correlated with the cloud condensation nuclei (CCN) activity of SOA particles. These findings may have important implications for future climate scenarios in which increased temperatures cause higher biogenic volatile organic compound (VOC) emissions, which in turn lead to higher SOA mass formation and significant changes in SOA composition. Such processes would need to be considered in climate models for a realistic representation of future aerosol-climate-biosphere feedbacks.

Relevance:

30.00%

Publisher:

Abstract:

Based on detailed reconstructions of global distribution patterns, both paleoproductivity and the benthic d13C record of CO2 dissolved in the deep ocean differed strongly between the Last Glacial Maximum and the Holocene. With the onset of Termination I about 15,000 years ago, the new (export) production of low- and mid-latitude upwelling cells started to decline by more than 2-4 Gt carbon/year. This reduction is regarded as a main factor leading both to the simultaneous rise in atmospheric CO2 recorded in ice cores and, with a slight delay of more than 1,000 years, to a large-scale gradual CO2 depletion of the deep ocean by about 650 Gt C. This estimate is based on an average increase in benthic d13C of 0.4-0.5 per mil. The decrease in new production also matches a clear 13C depletion of organic matter, possibly recording the end of extreme nutrient utilization in upwelling cells. As shown by Sarnthein et al. [1987], the productivity reversal appears to have been triggered by a rapid reduction in the strength of the meridional trades, which in turn was linked, via a shrinking extent of sea ice, to a massive increase in high-latitude insolation, i.e., to orbital forcing as the primary cause.

Relevance:

30.00%

Publisher:

Abstract:

We propose a mechanism for testing the theory of collapse models, such as continuous spontaneous localization (CSL), by examining the parametric heating rate of a trapped nanosphere. The random localizations of the center of mass for a given particle predicted by the CSL model can be understood as a stochastic force embodying a source of heating for the nanosphere. We show that by utilising a Paul trap to levitate the particle, together with optical cooling, it is possible to reduce environmental decoherence to such a level that CSL dominates the dynamics and contributes the main source of heating. We show that this approach allows measurements to be made on the timescale of seconds, and that the free parameter λ_CSL which characterises the model ought to be testable down to values as low as 10^{-12} Hz.
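
A toy illustration of the central idea above, namely that a white-noise stochastic force heats a trapped oscillator at a constant rate. The trap frequency, noise strength and units are invented, and this is not the paper's CSL heating-rate formula; the sketch only shows the linear-in-time energy growth such a force produces.

import numpy as np

# 1D harmonic oscillator kicked by a white-noise force (semi-implicit Euler).
# All parameters are illustrative; arbitrary units throughout.
rng = np.random.default_rng(1)

m = 1.0            # mass
omega = 2*np.pi    # trap angular frequency
S_F = 1e-4         # white-noise force spectral density (assumed)
dt = 1e-3
steps = 200_000

x, v = 0.0, 0.0
energies = []
for i in range(steps):
    kick = np.sqrt(S_F / dt) * rng.standard_normal()  # discretised white noise
    v += dt * (-omega**2 * x + kick / m)
    x += dt * v
    if i % 1000 == 0:
        energies.append(0.5*m*v**2 + 0.5*m*omega**2*x**2)

# Energy grows roughly linearly in time; the heating rate is ~ S_F / (2 m),
# independent of the trap frequency.
rate = (energies[-1] - energies[0]) / (steps * dt)
print(rate, S_F / (2*m))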

Relevance:

30.00%

Publisher:

Abstract:

Models of neutrino-driven core-collapse supernova explosions have matured considerably in recent years. Explosions of low-mass progenitors can routinely be simulated in 1D, 2D, and 3D, and nucleosynthesis calculations indicate that these supernovae could be contributors of some lighter neutron-rich elements beyond iron. The explosion mechanism of more massive stars remains under investigation, although the first 3D models of neutrino-driven explosions employing multi-group neutrino transport have become available. Together with earlier 2D models and more simplified 3D simulations, these have elucidated the interplay between neutrino heating and hydrodynamic instabilities in the post-shock region that is essential for shock revival. However, some physical ingredients may still need to be added or improved before simulations can robustly explain supernova explosions over a wide range of progenitors. Solutions recently suggested in the literature appeal to uncertainties in the neutrino rates, to rotation, and to seed perturbations from convective shell burning. We review the implications of 3D simulations of shell burning in supernova progenitors for the 'perturbations-aided neutrino-driven mechanism', whose efficacy is illustrated by the first successful multi-group neutrino hydrodynamics simulation of an 18 solar mass progenitor with 3D initial conditions. We conclude with speculations about the impact of 3D effects on the structure of massive stars through convective boundary mixing.

Relevance:

30.00%

Publisher:

Abstract:

The concept of Mass Customization (MC) - producing customised goods for a mass market - has received considerable attention in the research literature in recent years. However, the literature provides only a limited understanding of the content of MC strategies (the organizational structures, process technologies, etc. that are best suited to a particular environment) and of the process of MC strategies (the sub-strategy an enterprise should select and how it should go about implementing an MC strategy). In this paper, six published classification schemes of relevance to Mass Customization are reviewed. The classification schemes are applied to five case studies of enterprises operating in an MC environment. The limitations of the schemes are analysed, and their failure to distinguish key characteristics is highlighted. Analysis of the findings leads to the development of a taxonomy of operational modes for MC. Five fundamental modes of operation for Mass Customization are identified. These modes are described and justified, and their application is illustrated by contrasting the information requirements of two modes. The potential of these modes to provide the foundations for detailed configuration models is discussed.

Relevance:

30.00%

Publisher:

Abstract:

There is increasing interest in evaluating the environmental effects on crop architectural traits and yield improvement. However, crop models describing the dynamic changes in canopy structure with environmental conditions, and the complex interactions between canopy structure, light interception, and dry mass production, are only gradually emerging. Using tomato (Solanum lycopersicum L.) as a model crop, a dynamic functional-structural plant model (FSPM) was constructed, parameterized, and evaluated to analyse the effects of temperature on architectural traits, which strongly influence canopy light interception and shoot dry mass. The FSPM predicted organ growth, organ size, and shoot dry mass over time with high accuracy (>85%). Analyses with this FSPM showed that, in comparison with the reference canopy, shoot dry mass may be affected by leaf angle by as much as 20%, by leaf curvature by up to 7%, by the leaf length:width ratio by up to 5%, by internode length by up to 9%, and by curvature ratios and leaf arrangement by up to 6%. Tomato canopies at low temperature had higher canopy density and were more clumped, owing to higher leaf area and shorter internodes. Interestingly, dry mass production and light interception of the clumped canopy were more sensitive to changes in architectural traits. The complex interactions between architectural traits, canopy light interception, dry mass production, and environmental conditions can be studied with this dynamic FSPM, which may serve as a tool for designing a canopy structure that is 'ideal' in a given environment.
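
A minimal sketch of the light-interception relation that canopy models of this kind build on: Beer-Lambert extinction through leaf area, with the extinction coefficient standing in for the architectural traits (leaf angle, curvature, arrangement) that an FSPM resolves explicitly. All values are illustrative.

import numpy as np

# Beer-Lambert canopy light interception: I_int = I0 * (1 - exp(-k * LAI)).
# A full FSPM computes interception from explicit 3D architecture; here the
# architectural traits are collapsed into the extinction coefficient k.
I0 = 1000.0                     # incident PAR, umol m^-2 s^-1 (assumed)
LAI = np.linspace(0.5, 4.0, 8)  # leaf area index over a growth period

def intercepted(I0, lai, k):
    return I0 * (1.0 - np.exp(-k * lai))

# A more erect canopy (lower k) intercepts less light at the same LAI
# than a more horizontal or clumped one (higher k).
for k in (0.5, 0.7, 0.9):
    print(k, intercepted(I0, LAI, k).round(0))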