915 results for "empirical shell model"


Relevance: 90.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
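
The flipped-label equivalence described above can be illustrated on a toy class of 1-D threshold classifiers. All names, the hypothesis class, and the data below are invented for illustration and are not from the paper; the sketch only demonstrates that the maximal discrepancy equals an ERM computation with half the labels flipped.

```python
import numpy as np

# Toy hypothesis class: 1-D threshold classifiers h_t(x) = sign(x - t).
def maximal_discrepancy(X, y, thresholds):
    """Direct computation: max over t of (error on the first half of
    the training data minus error on the second half)."""
    n = len(y) // 2
    best = -np.inf
    for t in thresholds:
        pred = np.where(X > t, 1, -1)
        err1 = np.mean(pred[:n] != y[:n])
        err2 = np.mean(pred[n:2 * n] != y[n:2 * n])
        best = max(best, err1 - err2)
    return best

def via_flipped_erm(X, y, thresholds):
    """Equivalent computation: 1 - 2 * (minimum empirical risk over the
    training data with the first-half labels flipped)."""
    n = len(y) // 2
    y_flip = np.concatenate([-y[:n], y[n:2 * n]])
    best_err = np.inf
    for t in thresholds:
        pred = np.where(X[:2 * n] > t, 1, -1)
        best_err = min(best_err, np.mean(pred != y_flip))
    return 1.0 - 2.0 * best_err
```

Both routines return the same value because misclassifying a flipped label is exactly classifying the original label correctly, so minimizing the flipped-sample risk maximizes the error difference between the two halves.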

Relevance: 90.00%

Abstract:

Accurate modelling of automotive occupant posture depends strongly on the mechanical interaction between human body soft tissue and flexible seat components. This paper presents a finite-element study simulating the deflection of seat cushion foam and supportive seat structures, as well as of human buttock and thigh soft tissue, when seated. The thigh-buttock surface shell model was based on scan data from a 95th-percentile male subject and composed of two layers, covering thin to moderate thigh and buttock proportions. To replicate the effects of skin and fat, the neoprene rubber layer was modelled as a hyperelastic material with viscoelastic behaviour. The analytical seat model is based on a Ford production seat. The result of the finite-element indentation simulation is compared with a previous simulation of an indentation with a hard-shell human model of equal geometry, and with the physical indentation result. We conclude that indentation by the SAE composite buttock form and human-seat indentation of a suspended seat cushion can be validly simulated.

Relevance: 90.00%

Abstract:

The pion photoproduction processes ¹⁴N_gs(γ, π⁺)¹⁴C and ¹⁴N_gs(γ, π⁻)¹⁴O have been studied in the threshold region. These processes provide an excellent tool for studying corrections to the soft-pion theorems and the Kroll-Ruderman limit as applied to nuclear processes. Agreement with the available experimental data for these processes is better with the empirical wave functions, while the shell-model wave functions predict a much higher value. It is shown that detailed experimental studies of these reactions at threshold should lead to a better understanding of the shell-model inputs and the radial distributions in the 1p state. We thank Dr. S.C.K. Nair for a helpful discussion during the initial stages of this work. One of us (MVN) thanks Dr. J.M. Laget for sending some unpublished data on pion photoproduction. He is also thankful to Dr. J. Pasupathy and Dr. R. Rajaraman for their interest and encouragement.

Relevance: 90.00%

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): We describe an empirical-statistical model of climates of the southwestern United States. Boundary conditions include sea surface temperatures, atmospheric transmissivity, and topography. Independent variables are derived from the boundary conditions along 1000-km paths of atmospheric circulation. ... Predictor equations are derived over a larger region than the application area to allow for the increased range of paleoclimate. This larger region is delimited by the autocorrelation properties of climatic data.
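
Predictor equations of this empirical-statistical kind are typically fitted by least squares over a calibration region. The sketch below uses entirely synthetic boundary-condition variables; the predictors, coefficients, and data are invented for illustration and are not the model's actual equations.

```python
import numpy as np

# Hypothetical example: fit a "predictor equation" for a climate variable
# from variables derived from boundary conditions (sea surface temperature,
# atmospheric transmissivity, topography). All values are synthetic.
rng = np.random.default_rng(0)
n_sites = 200
sst = rng.normal(18.0, 2.0, n_sites)             # upwind sea surface temperature (deg C)
transmissivity = rng.uniform(0.6, 0.8, n_sites)  # atmospheric transmissivity
elevation = rng.uniform(0.0, 3000.0, n_sites)    # site elevation (m)

# Synthetic "observed" precipitation generated from known coefficients.
precip = (50.0 + 12.0 * sst + 400.0 * transmissivity
          + 0.05 * elevation + rng.normal(0.0, 10.0, n_sites))

# Predictor equation by ordinary least squares over the calibration region.
A = np.column_stack([np.ones(n_sites), sst, transmissivity, elevation])
coef, _, _, _ = np.linalg.lstsq(A, precip, rcond=None)
```

Calibrating over a region larger than the application area, as the abstract describes, simply means the rows of `A` span a wider range of predictor values than those at which the fitted equation is later evaluated.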

Relevance: 90.00%

Abstract:

The shell effect is included in the improved isospin-dependent quantum molecular dynamics model, in which the shell correction energy of the system is calculated using the deformed two-center shell model. A switch function is introduced to connect the shell correction energy of the projectile and target with that of the compound nucleus during the dynamical fusion process. The calculated capture cross sections are found to reproduce the experimental data quantitatively at energies near the Coulomb barrier. The capture cross sections for the reaction ⁸⁰Br + ²⁰⁸Pb → ²⁸⁸X (Z = 117) are also calculated and discussed.
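
The switch-function idea can be sketched as a smooth interpolation between the shell correction of the separated projectile and target and that of the compound nucleus. The smoothstep form and the progress coordinate `x` below are illustrative assumptions, not the paper's actual parametrization.

```python
def switch(x):
    """Smooth switch function f in [0, 1]; a smoothstep polynomial is
    used here as an illustrative choice (the paper's exact functional
    form is not given in the abstract)."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    return x * x * (3.0 - 2.0 * x)  # f(0)=0, f(1)=1, zero slope at both ends

def shell_correction(e_separate, e_compound, x):
    """Interpolated shell correction energy during fusion:
    E_sh(x) = (1 - f(x)) * (E_proj + E_targ) + f(x) * E_CN,
    where x (an assumed coordinate) tracks the progress of fusion,
    e.g. the overlap of projectile and target."""
    f = switch(x)
    return (1.0 - f) * e_separate + f * e_compound
```

At `x = 0` the correction is that of the separated nuclei and at `x = 1` that of the compound nucleus, with a continuous, differentiable transition in between, which is the role the abstract assigns to the switch function.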

Relevance: 90.00%

Abstract:

Using a shell model capable of describing the spectra of upper g9/2-shell nuclei close to the N = Z line, we study the structure of the two isomeric states 7⁺ and 21⁺ in the odd-odd N = Z nucleus Ag-94. Both isomeric states are found to exhibit large collectivity. The 7⁺ state is oblately deformed and is suggested to be a shape isomer in nature. The 21⁺ state becomes isomeric because of a level inversion of the 19⁺ and 21⁺ states caused by core excitations across the N = Z = 50 shell gap. Calculation of the spectroscopic quadrupole moments clearly indicates an enhancement in these states due to the core excitations. However, the present shell-model calculation that produces the 19⁺-21⁺ level inversion is incompatible with the large-deformation picture of Mukha et al.

Relevance: 90.00%

Abstract:

The isoscalar (T = 0) plus isovector (T = 1) pairing Hamiltonian in LS coupling, which is important for heavy N = Z nuclei, is solvable in terms of an SO(8) Lie algebra for three special values of the mixing parameter that measures the competition between T = 0 and T = 1 pairing. The SO(8) algebra is generated, amongst others, by the S = 1, T = 0 and S = 0, T = 1 pair creation and annihilation operators, and corresponding to the three values of the mixing parameter there are three chains of subalgebras: SO(8) ⊃ SO_ST(6) ⊃ SO_S(3) ⊗ SO_T(3); SO(8) ⊃ [SO_S(5) ⊃ SO_S(3)] ⊗ SO_T(3); and SO(8) ⊃ [SO_T(5) ⊃ SO_T(3)] ⊗ SO_S(3). Shell-model Lie algebras with only particle-number-conserving generators that are complementary to these three chains of subalgebras are identified, and they are used in the classification of states for a given number of nucleons. The classification problem is solved explicitly for states with SO(8) seniority ν = 0, 1, 2, 3 and 4. Using them, band structures in isospin space are identified for states with ν = 0, 1, 2 and 3. (c) 2005 Elsevier B.V. All rights reserved.

Relevance: 90.00%

Abstract:

Aquatic species can experience different selective pressures on morphology in different flow regimes. Species inhabiting lotic regimes often adapt to these conditions by evolving low-drag (i.e., streamlined) morphologies that reduce the likelihood of dislodgment or displacement. However, hydrodynamic factors are not the only selective pressures influencing organismal morphology, and shapes well suited to flow conditions may compromise performance in other roles. We investigated the possibility of morphological trade-offs in the turtle Pseudemys concinna. Individuals living in lotic environments have flatter, more streamlined shells than those living in lentic environments; however, this flatter shape may also make the shells less capable of resisting predator-induced loads. We tested the idea that "lotic" shell shapes are weaker than "lentic" shell shapes, concomitantly examining effects of sex. Geometric morphometric data were used to transform an existing finite element shell model into a series of models corresponding to the shapes of individual turtles. Models were assigned identical material properties and loaded under identical conditions, and the stresses produced by a series of eight loads were extracted to describe the strength of the shells. "Lotic" shell shapes produced significantly higher stresses than "lentic" shell shapes, indicating that the former are weaker than the latter. Females had significantly stronger shell shapes than males, although these differences were less consistent than the differences between flow regimes. We conclude that, despite the potential for many-to-one mapping of shell shape onto strength, P. concinna experiences a trade-off in shell shape between hydrodynamic and mechanical performance. This trade-off may be evident in many other turtle species, or in any other aquatic species that depends on a shell for defense. However, evolution of body size may provide an avenue of escape from this trade-off in some cases, as changes in size can drastically affect mechanical performance while having little effect on hydrodynamic performance.

Relevance: 80.00%

Abstract:

A breaker restrike is an abnormal arcing phenomenon that can lead to breaker failure. Such a failure interrupts the transmission and distribution of the electricity supply until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on measuring and interpreting restrikes produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008, a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the radiometric measurement method has a band-limited frequency response and limited amplitude determination. Current restrike detection methods and algorithms require wide-bandwidth current transformers and high-voltage dividers. A restrike switch model using the Alternative Transients Program (ATP), together with wavelet transforms to support diagnostics, is proposed; restrike phenomena thereby become the basis of a diagnostic process using measurements, ATP and wavelet transforms for online interrupter monitoring. This research project investigates the restrike switch model parameter 'A', the dielectric voltage gradient, in relation to normal and slowed contact opening velocities and the escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load, a transient voltage develops across the contact gap at current quenching or chopping. The dielectric strength of the gap must rise quickly enough to withstand this transient voltage; if it does not, the gap flashes over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap; this is the point at which the gap voltage exceeds the dielectric strength of the gap.
This research shows that a change in the opening contact velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope. To investigate the diagnostic process, the ATP restrike switch model was extended with contact opening velocity computation for restrike waveform signature analyses, alongside experimental investigations. A mathematical CB model was also enhanced with an empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted when the measured and simulated waveforms show similar restrike waveform signatures. The restrike switch model is applied to: computer simulations as virtual experiments, including prediction of breaker restrikes; estimation of the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and development of a restrike detection algorithm using wavelet transforms. A simulated high-frequency nozzle current magnitude was applied to an equation derived from the literature that estimates the life extension of the interrupter of an SF6 high-voltage CB. The restrike waveform signatures for medium- and high-voltage CBs identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using Matlab software for automatic detection. An experimental investigation of a 12 kV vacuum CB diagnostic was carried out for parameter determination, and a passive antenna calibration was also successfully developed with applications for field implementation.
The degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the voltage drop associated with a slow opening velocity mechanism gives a measure of contact degradation. A predictive interpretation technique is a computer-modelling approach for assessing switching-device performance that allows one to vary a single parameter at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis' outcome is a non-intrusive method, developed using measurements, ATP and wavelet transforms, to predict and interpret breaker restrike risk. Measurements on high-voltage circuit-breakers can identify degradation that could otherwise interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for monitoring restrike phenomena developed by this research will form part of a diagnostic process valuable for detecting breaker stresses related to interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
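
The wavelet-based detection step can be sketched with a single-level Haar transform applied to a synthetic recovery-voltage waveform. The wavelet choice, the threshold rule, and the waveform below are illustrative assumptions, not the thesis' actual algorithm or data.

```python
import numpy as np

def haar_detail(signal):
    """One level of the Haar wavelet transform; the detail coefficients
    emphasise abrupt, high-frequency content such as restrike transients."""
    s = signal[: len(signal) // 2 * 2]          # even length
    return (s[0::2] - s[1::2]) / np.sqrt(2.0)

def detect_restrike(voltage, k=8.0):
    """Flag detail coefficients whose magnitude exceeds k times a robust
    noise scale (median-absolute-deviation estimate). The threshold rule
    is an assumed, illustrative choice."""
    d = haar_detail(voltage)
    sigma = np.median(np.abs(d)) / 0.6745 + 1e-12
    return np.abs(d) > k * sigma

# Synthetic recovery-voltage waveform: a 50 Hz component with an injected
# high-frequency burst standing in for a restrike.
t = np.linspace(0.0, 0.04, 4000)
v = np.sin(2 * np.pi * 50.0 * t)
v[2000:2040] += 0.8 * np.sin(2 * np.pi * 5000.0 * t[2000:2040])
```

Because the 50 Hz component is nearly constant between adjacent samples, its detail coefficients stay small, while the burst produces detail coefficients two orders of magnitude larger, so a simple robust threshold isolates the transient.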

Relevance: 80.00%

Abstract:

Shared services is a prominent organizational arrangement, in particular for support functions. The success (or failure) of shared services is a critical concern, as the move to shared services can entail large-scale investment and fundamental organizational change. The Higher Education (HE) sector is particularly well poised to benefit from shared services, as there is a need to improve organizational performance and strong potential for sharing. Through a multiple case study of shared services experiences in HE, this study identifies ten important antecedents of shared services success: (1) understanding of shared services; (2) organizational environment; (3) top management support; (4) IT environment; (5) governance; (6) a process-centric view; (7) implementation strategy; (8) project management; (9) change management; and (10) communication. The study then develops a preliminary model of shared services success that addresses the interdependencies between the success factors. As the first empirical success model for shared services, it provides valuable guidance to practice and future research.

Relevance: 80.00%

Abstract:

Purpose – Integrated supplier management (ISM), new product development (NPD) and knowledge sharing (KS) practices are three primary business activities utilised to enhance manufacturers' business performance (BP). The purpose of this paper is to empirically investigate the relationships between these three business activities (i.e. ISM, NPD, KS) and BP in a Taiwanese electronics manufacturing context. Design/methodology/approach – A questionnaire survey is first administered to a sample of electronics manufacturing companies operating in Taiwan to elicit the opinions of technical and managerial professionals regarding business activities and BP within their companies. A total of 170 respondents from 83 companies respond to the survey. Factor, correlation and path analyses are undertaken on this quantitative data set to derive the key factors which leverage business outcomes in these companies. Following the empirical analysis, six semi-structured interviews are undertaken with manufacturing executives to provide qualitative insights into why certain business activity factors are the strongest predictors of BP. Findings – The investigation shows that the ISM, NPD and KS constructs all play an important role in the success of company operations and in creating business outcomes. Specifically, the key factors within these constructs which influence BP are: supplier evaluation and selection; design simplification and modular design; information technology infrastructure and systems; and open communication. Accordingly, sufficient financial and human resources should be allocated to these important activities to derive accelerated rates of improved BP. These findings are supported by the qualitative interviews with manufacturing executives. Originality/value – The paper depicts the pathways to improved manufacturing BP, through targeting efforts at the above-mentioned factors within the ISM, NPD and KS constructs.
Based on the empirical path model, and the specific insights derived from the explanatory interviews with manufacturing executives, the paper also provides a number of practical implications for manufacturing companies seeking to enhance their BP through improved operational activities.

Relevance: 80.00%

Abstract:

Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model that may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single receiver's data stream. The methods involve three steps: forming a linear combination; handling the ionosphere and ambiguity biases; and estimating the variance. Two distinct approaches, the time-differencing method and the polynomial prediction method, are applied to overcome the ionosphere and ambiguity biases. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method requires only single-receiver observations and is thus applicable to both differenced and un-differenced data processing modes. However, the methods assume normal ionospheric conditions and GNSS receivers with low noise autocorrelation. Experimental results also indicate that the proposed method yields more realistic parameter precision estimates.
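
The time-differencing step can be sketched as follows: differencing a series between consecutive epochs removes slowly varying biases (ionosphere, ambiguity), and for uncorrelated noise the differenced variance is twice the observation variance. The synthetic bias and the 3 mm noise level below are assumptions for illustration, not real GNSS data.

```python
import numpy as np

def noise_std_time_diff(series):
    """Estimate the white-noise standard deviation of a single-receiver
    observation series by time differencing. Assumes the bias varies
    slowly between epochs and the noise is epoch-uncorrelated, so
    var(diff) ~= 2 * var(noise)."""
    d = np.diff(series)
    return np.std(d, ddof=1) / np.sqrt(2.0)

# Synthetic check: a slowly varying bias plus white noise of known sigma.
rng = np.random.default_rng(1)
t = np.arange(3600)                            # one epoch per second
bias = 0.5 * np.sin(2 * np.pi * t / 3600)      # slow "ionospheric" drift (m)
obs = bias + rng.normal(0.0, 0.003, t.size)    # 3 mm observation noise
est = noise_std_time_diff(obs)
```

Because the bias changes by far less than the noise between adjacent epochs, the estimate recovers the injected 3 mm sigma closely; with strongly autocorrelated receiver noise this assumption breaks down, matching the limitation noted in the abstract.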

Relevance: 80.00%

Abstract:

Various reasons, such as ethical issues in maintaining blood resources, growing costs, and strict requirements for safe blood, have increased the pressure for efficient use of resources in blood banking. The competence of blood establishments can be characterized by their ability to predict the volume of blood collection so as to provide cellular blood components in a timely manner as dictated by hospital demand. The stochastically varying clinical need for platelets (PLTs) sets a specific challenge for balancing supply with requests. Labour has been shown to be a primary cost driver and should be managed efficiently. International comparisons of blood banking could reveal inefficiencies and allow reallocation of resources. Seventeen blood centres from 10 countries in continental Europe, Great Britain, and Scandinavia participated in this study. The centres were national institutes (5), parts of the local Red Cross organisation (5), or integrated into university hospitals (7). This study focused on the centres' departments of blood component preparation. The data were obtained retrospectively by computerized questionnaires completed via the Internet for the years 2000-2002. The data were used in four original articles (numbered I through IV) that form the basis of this thesis. Non-parametric data envelopment analysis (DEA, II-IV) was applied to evaluate and compare the relative efficiency of blood component preparation. Several models were created using different input and output combinations. The comparisons focused on technical efficiency (II-III) and labour efficiency (I, IV). An empirical cost model was tested to evaluate cost efficiency (IV). Purchasing power parities (PPP, IV) were used to adjust the costs of the working hours and make the costs comparable among countries. The total annual number of whole blood (WB) collections varied from 8,880 to 290,352 across the centres (I).
Significant variation was also observed in the annual volume of produced red blood cells (RBCs) and PLTs. The annual number of PLTs produced by any method varied from 2,788 to 104,622 units. In 2002, 73% of all PLTs were produced by the buffy coat (BC) method, 23% by apheresis and 4% by the platelet-rich plasma (PRP) method. The annual discard rate of PLTs varied from 3.9% to 31%. The mean discard rate (13%) remained in the same range throughout the study period and demonstrated similar levels and variation in 2003-2004 according to a specific follow-up question (14%, range 3.8%-24%). The annual PLT discard rates were, to some extent, associated with production volumes. The mean RBC discard rate was 4.5% (range 0.2%-7.7%). Technical efficiency showed marked variation (median 60%, range 41%-100%) among the centres (II). Compared to the efficient departments, the inefficient departments used excess labour resources (and probably excess production equipment) to produce RBCs and PLTs. Technical efficiency tended to be higher when the (theoretical) proportion of lost WB collections (total RBC+PLT loss) from all collections was low (III). The labour efficiency varied remarkably, from 25% to 100% (median 47%), when working hours were the only input (IV). Using the estimated total costs as the input (cost efficiency) revealed an even greater variation (13%-100%) and an overall lower efficiency level compared to labour only as the input. In cost efficiency alone, the savings potential (observed inefficiency) was more than 50% in 10 departments, whereas the labour and cost savings potentials were both more than 50% in six departments. The association between department size and efficiency (scale efficiency) could not be verified statistically in the small sample. In conclusion, international evaluation of the technical efficiency of component preparation departments revealed remarkable variation.
A suboptimal combination of manpower and production output levels was the major cause of inefficiency, and the efficiency did not directly relate to production volume. Evaluation of the reasons for discarding components may offer a novel approach to study efficiency. DEA was proven applicable in analyses including various factors as inputs and outputs. This study suggests that analytical models can be developed to serve as indicators of technical efficiency and promote improvements in the management of limited resources. The work also demonstrates the importance of integrating efficiency analysis into international comparisons of blood banking.
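
The DEA comparison described above can be sketched as an input-oriented CCR envelopment model solved once per department with `scipy.optimize.linprog`. The toy inputs and outputs below are invented for illustration; the thesis' actual DEA models used several input/output combinations not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y):
    """Input-oriented CCR DEA efficiencies (illustrative sketch).

    X: (n_dmu, n_inputs), e.g. working hours or total cost per department
    Y: (n_dmu, n_outputs), e.g. RBC and PLT units produced
    Returns theta in (0, 1] for each decision-making unit: the factor by
    which its inputs could be scaled down while a convex-cone combination
    of peers still matches its outputs."""
    n, m = X.shape
    _, s = Y.shape
    theta = np.empty(n)
    for o in range(n):
        # Variables: [theta, lambda_1 ... lambda_n]; minimise theta.
        c = np.zeros(1 + n)
        c[0] = 1.0
        A_in = np.hstack([-X[o][:, None], X.T])      # sum_j λ_j x_j <= θ x_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])  # sum_j λ_j y_j >= y_o
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.concatenate([np.zeros(m), -Y[o]])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
        theta[o] = res.x[0]
    return theta
```

For example, with one input and one output, a department using twice the input of an equally productive peer receives an efficiency of 0.5, mirroring the "savings potential" interpretation used in the thesis.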

Relevance: 80.00%

Abstract:

The out-of-plane behaviour of mortared and mortarless masonry walls with various forms of reinforcement, including unreinforced masonry as a base case, is examined using an explicit finite element modelling method based on layered shell elements. Wall systems containing internal reinforcement, external surface reinforcement, and intermittently laced reinforced concrete members, as well as unreinforced masonry panels, are considered. Masonry is modelled as a layer with macroscopic orthotropic properties; the external reinforcing render, grout, and reinforcing bars are modelled as distinct layers of the shell element. Predictions from the layered shell model have been validated against several out-of-plane experimental datasets reported in the literature. The model is used to examine the effectiveness of two retrofitting schemes for an unreinforced masonry wall.