950 results for Mathematical and statistical techniques


Relevance: 100.00%

Abstract:

We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the life course? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging.
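The partitioning question above can be made concrete with a small simulation. The sketch below assumes a Gompertz baseline hazard modulated by a gamma-distributed per-individual frailty (all parameter values are illustrative, not taken from the review) and splits lifespan variance into a between-individual component and a purely stochastic remainder via the law of total variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Gompertz baseline hazard h(t) = a*exp(b*t) modulated by a
# gamma-distributed per-individual "frailty" Z (parameters illustrative).
a, b = 1e-4, 0.1
n = 100_000
z = rng.gamma(shape=4.0, scale=0.25, size=n)   # E[Z] = 1, fixed per individual

# Inverse-CDF sampling: S(t|z) = exp(-z*(a/b)*(exp(b*t) - 1)) inverts in
# closed form, giving one random lifespan per individual.
u = rng.uniform(size=n)
t = np.log(1.0 - np.log(u) * b / (a * z)) / b

# Law of total variance: Var(T) = E[Var(T|Z)] + Var(E[T|Z]).  Estimate the
# between-individual ("inherent") share by binning on frailty quantiles.
edges = np.quantile(z, np.linspace(0.0, 1.0, 51))
idx = np.digitize(z, edges[1:-1])
cond_mean = np.array([t[idx == k].mean() for k in range(50)])
counts = np.array([(idx == k).sum() for k in range(50)])
between = np.average((cond_mean - t.mean()) ** 2, weights=counts)
share = between / t.var()
print(f"share of lifespan variance tied to individual frailty: {share:.2f}")
```

The complementary share, 1 − share, is the part of lifespan variation that is "entirely random" in the sense of the questions above.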

Relevance: 100.00%

Abstract:

Cities, which are now inhabited by a majority of the world's population, are not only an important source of global environmental and resource depletion problems, but can also act as important centres of technological innovation and social learning in the continuing quest for a low carbon future. Planning and managing large-scale transitions in cities to deal with these pressures requires an understanding of urban retrofitting at city scale. In this context, performative techniques (such as backcasting and roadmapping) can provide valuable tools for helping cities develop a strategic view of the future. However, it is also important to identify ‘disruptive’ and ‘sustaining’ technologies which may contribute to city-based sustainability transitions. This paper presents research findings from the EPSRC Retrofit 2050 project, and explores the relationship between technology roadmaps and the transition theory literature, highlighting the research gaps at urban/city level. The paper develops a research methodology to describe the development of three guiding visions for city-regional retrofit futures, and identifies key sustaining and disruptive technologies at city scale within these visions using foresight (horizon scanning) techniques. The implications of the research for city-based transition studies and related methodologies are discussed.

Relevance: 100.00%

Abstract:

We investigate the initialization of Northern Hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates significantly reduces assimilation error both in identical-twin experiments and when assimilating sea-ice observations, reducing the concentration error by a factor of four to six, and the thickness error by a factor of two. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that the strong dependence of thermodynamic ice growth on ice concentration necessitates an adjustment of mean ice thickness in the analysis update. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that proportional mean-thickness updates are superior to the other two methods considered and enable us to assimilate sea ice in a global climate model using simple Newtonian relaxation.
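A minimal sketch of the update scheme described above: a Newtonian-relaxation analysis step on concentration, with the mean-thickness increment taken proportional to the concentration increment. The relaxation fraction and the proportionality constant `r` are assumed illustrative values, not the paper's settings:

```python
import numpy as np

def assimilate(c_model, h_model, c_obs, tau_frac=0.25, r=1.0):
    """One analysis step for sea-ice concentration and mean thickness.

    c_model  -- model ice concentration (0..1)
    h_model  -- model mean ice thickness (m, averaged over the grid cell)
    c_obs    -- observed ice concentration
    tau_frac -- fraction of the model-observation misfit removed per step
    r        -- metres of mean thickness added/removed per unit of
                concentration change (an assumed constant; it could be
                informed by the background error covariance cov(h, c)/var(c))
    """
    # Newtonian relaxation of concentration towards the observations.
    dc = tau_frac * (c_obs - c_model)
    c_new = np.clip(c_model + dc, 0.0, 1.0)
    # Proportional mean-thickness analysis update: dh = r * dc.
    h_new = np.maximum(h_model + r * (c_new - c_model), 0.0)
    return c_new, h_new

c = np.array([0.9, 0.5, 0.0])
h = np.array([2.0, 1.0, 0.0])
obs = np.array([0.7, 0.7, 0.2])
c1, h1 = assimilate(c, h, obs)
print(c1, h1)
```

By contrast, the two alternatives found inferior in the abstract would either leave `h` untouched (conserving mean thickness) or rescale `h` by `c_new / c_model` (conserving actual thickness).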

Relevance: 100.00%

Abstract:

We investigate the initialisation of Northern Hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates leads to good assimilation performance for sea-ice concentration and thickness, both in identical-twin experiments and when assimilating sea-ice observations. The simulation of other Arctic surface fields in the coupled model is, however, not significantly improved by the assimilation. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that an adjustment of mean ice thickness in the analysis update is essential to arrive at plausible state estimates. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that assimilation with proportional mean-thickness updates outperforms the other two methods considered. The method described here is very simple to implement, and gives results that are sufficiently good to be used for initialising sea ice in a global climate model for seasonal to decadal predictions.

Relevance: 100.00%

Abstract:

The congruential rule advanced by Graves for polarization basis transformation of the radar backscatter matrix is now often misinterpreted as an example of consimilarity transformation. However, consimilarity transformations imply a physically unrealistic antilinear time-reversal operation. This is just one of the approaches found in the literature to the description of transformations where the role of conjugation has been misunderstood. In this paper, the different approaches are examined particularly with respect to the role of conjugation. In order to justify and correctly derive the congruential rule for polarization basis transformation and properly place the role of conjugation, the origin of the problem is traced back to the derivation of the antenna height from the transmitted field. In fact, careful consideration of the role played by the Green’s dyadic operator relating the antenna height to the transmitted field shows that, under general unitary basis transformation, it is not justified to assume a scalar relationship between them. Invariance of the voltage equation shows that antenna states and wave states must in fact lie in dual spaces, a distinction not captured in conventional Jones vector formalism. By introducing spinor formalism, and with the use of an alternate spin frame for the transmitted field, a mathematically consistent implementation of the directional wave formalism is obtained. Examples are given comparing the wider generality of the congruential rule in both active and passive transformations with the consimilarity rule.
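One concrete property behind the congruential rule is that it preserves the symmetry that reciprocity imposes on the backscatter matrix under any unitary basis change, whereas an ordinary similarity transform generally does not. A small numerical check (the matrix values are hypothetical):

```python
import numpy as np

# Backscatter (Sinclair) matrix of a hypothetical reciprocal target in the
# linear H/V basis; reciprocity makes it symmetric (S_hv = S_vh).
S = np.array([[1.0 + 0.2j, 0.3 - 0.1j],
              [0.3 - 0.1j, 0.8 + 0.0j]])

# A unitary change of polarization basis (linear to circular).
U = (1.0 / np.sqrt(2)) * np.array([[1.0, 1.0j],
                                   [1.0j, 1.0]])

# Graves' congruential rule uses the transpose, not the conjugate transpose.
S_cong = U @ S @ U.T
assert np.allclose(S_cong, S_cong.T)    # reciprocity symmetry is preserved

# An ordinary similarity transform (U S U^-1 = U S U^H for unitary U)
# generally destroys that symmetry.
S_sim = U @ S @ U.conj().T
print(np.allclose(S_sim, S_sim.T))      # -> False
```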

Relevance: 100.00%

Abstract:

The main goal of this work was to evaluate thermodynamic parameters of the soybean oil extraction process using ethanol as solvent. The experimental treatments were as follows: aqueous solvents with water contents varying from 0 to 13% (mass basis) and extraction temperatures varying from 50 to 100 degrees C. The distribution coefficients of oil at equilibrium were used to calculate the enthalpy, entropy and free energy changes. The results indicate that the oil extraction process with ethanol is feasible and spontaneous, mainly at higher temperatures. The influences of the water level in the solvent and of temperature were also analysed using response surface methodology (RSM); the extraction yield was highly affected by both independent variables. A joint analysis of the thermodynamic results and the RSM indicates the optimal level of solvent hydration and temperature for performing the extraction process.
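The thermodynamic quantities mentioned above follow from standard relations: ΔG = −RT ln K at each temperature, and a van 't Hoff fit of ln K against 1/T yields ΔH and ΔS. A sketch with hypothetical distribution coefficients (not the paper's data):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Hypothetical oil distribution coefficients K at three extraction
# temperatures (kelvin); illustrative values only, not the paper's data.
T = np.array([323.15, 343.15, 363.15])   # 50, 70, 90 degrees C
K = np.array([0.85, 1.10, 1.40])

# Free energy change at each temperature: dG = -R T ln K.
dG = -R * T * np.log(K)

# Van 't Hoff relation: ln K = -dH/(R T) + dS/R, so a straight-line fit
# of ln K against 1/T gives dH from the slope and dS from the intercept.
slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)
dH = -slope * R      # J/mol (positive here: endothermic)
dS = intercept * R   # J/(mol K)
print(f"dH = {dH / 1000:.1f} kJ/mol, dS = {dS:.1f} J/(mol K)")
print("spontaneous at the highest temperature:", dG[-1] < 0)
```

With K increasing with temperature, ΔG changes sign from positive to negative across the range, matching the abstract's observation that extraction is spontaneous mainly at higher temperature.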

Relevance: 100.00%

Abstract:

Royal palm tree peroxidase (RPTP) is a very stable enzyme with regard to acidity, temperature, H(2)O(2), and organic solvents. Thus, RPTP is a promising candidate for developing H(2)O(2)-sensitive biosensors for diverse applications in industry and analytical chemistry. RPTP belongs to the family of class III secretory plant peroxidases, which includes horseradish peroxidase isozyme C and the soybean and peanut peroxidases. Here we report the X-ray structure of native RPTP isolated from royal palm tree (Roystonea regia), refined to a resolution of 1.85 angstrom. RPTP has the same overall folding pattern as the plant peroxidase superfamily, and it contains one heme group and two calcium-binding sites in similar locations. The three-dimensional structure of RPTP was solved in a hydroperoxide complex state, and it revealed a bound 2-(N-morpholino)ethanesulfonic acid (MES) molecule positioned at a putative secondary substrate-binding site. Nine N-glycosylation sites are clearly defined in the RPTP electron-density maps, revealing for the first time the conformations of the glycan chains of this highly glycosylated enzyme. Furthermore, statistical coupling analysis (SCA) of the plant peroxidase superfamily was performed. This sequence-based method identified a set of evolutionarily conserved sites that map to regions surrounding the heme prosthetic group. The SCA matrix also predicted a set of energetically coupled residues that are involved in maintaining the structural fold of plant peroxidases. The combination of crystallographic data and SCA analysis provides information about the key structural elements that could help explain the unique stability of RPTP. (C) 2009 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

Photovoltaic processing is one of the most significant processes in a semiconductor process line. It is complicated by the number of factors that directly or indirectly affect the processing and the final yield, so results related to diffusion, antireflective coating and impurity poisoning cannot be predicted assertively by mathematical or empirical means alone. Here I have run experiments and collected data on mono-crystalline silicon wafers with varying properties and outputs. A neural network trained on the available experimental data is then used to estimate the required output, which is further validated against test data for authenticity. The result can be regarded as a kind of process simulation that maps varying raw-wafer inputs to the desired yield of mono-crystalline photovoltaic cells.
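As a rough illustration of the approach, the sketch below trains a small one-hidden-layer network on synthetic stand-in wafer data (the feature names and values are invented, not the experimental data set) and checks it on held-out test points:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the wafer data set: each row holds hypothetical
# raw-wafer/process descriptors (e.g. resistivity, diffusion temperature,
# ARC thickness); the target is the resulting cell yield.  All values are
# invented for illustration.
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = 0.15 + 0.05 * X[:, 0] - 0.04 * (X[:, 1] - 0.5) ** 2 + 0.02 * X[:, 2]

# Hold out test data for the "authenticity" check described above.
X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]

# One-hidden-layer network trained by plain gradient descent on MSE.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.1, (8, 1)); b2 = np.zeros(1)
lr, n = 0.3, len(y_tr)
for _ in range(4000):
    h = np.tanh(X_tr @ W1 + b1)                    # hidden activations
    err = (h @ W2 + b2).ravel() - y_tr             # prediction error
    gW2 = h.T @ err[:, None] / n                   # output-layer gradients
    gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1 - h ** 2) / n  # backprop through tanh
    gW1, gb1 = X_tr.T @ dh, dh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = (np.tanh(X_te @ W1 + b1) @ W2 + b2).ravel()
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
print(f"held-out RMSE: {rmse:.4f}")
```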

Relevance: 100.00%

Abstract:

Increasing costs and competitive business strategies are pushing sawmill enterprises to optimize their process management. Organizational decisions concentrate mainly on performance and reduction of operational costs in order to maintain profit margins. Despite many efforts, effective utilization of resources, optimal planning and maximum productivity remain challenging for the sawmill industry. Many researchers have proposed simulation models combined with optimization techniques to address integrated logistics optimization problems. The combination of simulation and optimization identifies the optimal strategy by simulating all the complex behaviours of the system under consideration, including its objectives and constraints. During the past decade, a large number of studies simulated operational inefficiencies in order to find optimal solutions. This paper reviews recent developments and challenges associated with simulation and optimization techniques, and is intended to provide a solid foundation for the authors' further work on optimizing sawmill yard operations.
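A toy example of the simulation-optimization combination described above: a stochastic model of truck unloading in a yard (all rates and costs are invented) is averaged over replications, and a resource level is then chosen to minimize total cost:

```python
import random

def simulate_yard(n_loaders, n_trucks=40, seed=0):
    """Toy stochastic simulation of a sawmill yard (all rates invented):
    log trucks arrive and queue for a pool of loaders; returns mean wait."""
    rng = random.Random(seed)
    free_at = [0.0] * n_loaders            # time each loader becomes free
    t, waits = 0.0, []
    for _ in range(n_trucks):
        t += rng.expovariate(1.0 / 5.0)    # mean inter-arrival: 5 minutes
        i = min(range(n_loaders), key=lambda j: free_at[j])
        start = max(t, free_at[i])
        waits.append(start - t)
        free_at[i] = start + rng.uniform(6.0, 14.0)   # unloading time
    return sum(waits) / len(waits)

def total_cost(n_loaders, wait_cost=2.0, loader_cost=15.0, reps=30):
    # Simulation-optimization: average the stochastic simulation over
    # replications, then trade truck waiting cost against loader cost.
    mean_wait = sum(simulate_yard(n_loaders, seed=r) for r in range(reps)) / reps
    return loader_cost * n_loaders + wait_cost * mean_wait

best = min(range(1, 7), key=total_cost)
print("number of loaders minimizing total cost:", best)
```

Here the "optimizer" is an exhaustive search over a single integer decision variable; the studies reviewed in the paper replace it with metaheuristics or mathematical programming over far larger decision spaces.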

Relevance: 100.00%

Abstract:

The rapid development of data transfer through the internet has made it easier to send data accurately and quickly to a destination. There are many transmission media for transferring data, such as e-mail; at the same time, valuable information can be modified and misused through hacking. To transfer data securely to the destination without modification, approaches such as cryptography and steganography are used. This paper deals with image steganography together with the related security issues, and gives a general overview of cryptography, steganography and digital watermarking. The problem of copyright violation of multimedia data has increased with the enormous growth of computer networks, which provide fast and error-free transmission of unauthorized and possibly manipulated copies of multimedia information. To be effective for copyright protection, a digital watermark must be robust: difficult to remove from the object in which it is embedded despite a variety of possible attacks. To send the message safely and securely, we use invisible watermarking and embed the message with the LSB (Least Significant Bit) steganographic technique. The standard LSB technique embeds the message in every pixel; the contribution of the proposed watermarking scheme is to embed the message only along the image edges. Even if an attacker knows that the system uses an LSB technique, the correct message cannot be recovered. To make the system robust and secure, we add a cryptographic algorithm, the Vigenère square, so that the message is transmitted as ciphertext, an added advantage of the proposed system. The standard Vigenère square works only with lower-case or upper-case letters; the proposed algorithm extends the Vigenère square with numbers as well, so the crypto key can combine characters and numbers.
By combining these modifications to the existing algorithm with both cryptography and steganography, we develop a secure and strong watermarking method. The performance of the watermarking scheme has been analyzed by evaluating the robustness of the algorithm with PSNR (Peak Signal-to-Noise Ratio) and MSE (Mean Square Error) against image quality for large amounts of data. The proposed encryption achieves a high PSNR of 89 dB with a small MSE of 0.0017. This suggests that the proposed watermarking system is secure and robust for hiding sensitive information in a digital system, because it combines the properties of both steganography and cryptography.
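A sketch of the two ingredients named above: a Vigenère square extended to a 36-symbol alphabet of letters and digits, and LSB embedding of the resulting ciphertext with PSNR/MSE quality metrics. The edge-detection step is replaced here by a stand-in mask over all pixels, and all values are illustrative:

```python
import numpy as np

# Extended Vigenere square over A-Z plus 0-9 (36 symbols), so keys may mix
# characters and numbers as proposed above.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
IDX = {ch: i for i, ch in enumerate(ALPHABET)}

def vigenere(text, key, decrypt=False):
    sign = -1 if decrypt else 1
    return "".join(
        ALPHABET[(IDX[ch] + sign * IDX[key[i % len(key)]]) % 36]
        for i, ch in enumerate(text)
    )

cipher = vigenere("MEET2MORROW", "K3Y")          # key mixes letters and a digit
assert vigenere(cipher, "K3Y", decrypt=True) == "MEET2MORROW"

# LSB embedding of the ciphertext bits into an 8-bit grayscale image.  The
# mask here is a stand-in covering every pixel; the proposed scheme would
# restrict embedding to pixels flagged by an edge detector.
img = np.arange(256, dtype=np.uint8).reshape(16, 16)
bits = np.unpackbits(np.frombuffer(cipher.encode(), dtype=np.uint8))
stego = img.copy().ravel()
stego[: bits.size] = (stego[: bits.size] & 0xFE) | bits
stego = stego.reshape(img.shape)

# Quality metrics used in the evaluation: MSE and PSNR.
mse = np.mean((img.astype(float) - stego.astype(float)) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
print(f"MSE = {mse:.4f}, PSNR = {psnr:.1f} dB")
```

Because LSB embedding changes each touched pixel by at most 1, the MSE stays far below 1 and the PSNR correspondingly high, which is the behaviour the reported 89 dB / 0.0017 figures reflect.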

Relevance: 100.00%

Abstract:

From the geotechnical standpoint, it is interesting to analyse the soil texture in regions with rough terrain because of its relation to the infiltration and runoff processes and, consequently, its effect on erosion processes. The purpose of this paper is to present a methodology that provides the spatialization of soil texture using fuzzy logic and geostatistics. The results were correlated with maps drawn specifically for the study area. Knowledge of the spatialization of soil properties, such as texture, can be an important tool for land use planning in order to reduce potential soil losses during rainy seasons. (c) 2011 Published by Elsevier Ltd. Selection and peer-review under responsibility of Spatial Statistics 2011

Relevance: 100.00%

Abstract:

Purpose: The purpose of this study was to compare the artificial tooth positional changes following the flasking and polymerization of complete dentures by a combination of two flasking methods and two polymerization techniques using computer graphic measurements. Materials and Methods: Four groups of waxed complete dentures (n = 10) were invested and polymerized using the following techniques: (1) adding a second investment layer of gypsum and conventional water bath polymerization (Gypwater, the control), (2) adding a second investment layer of gypsum and polymerization with microwave energy (Gypmicro), (3) adding a second investment layer of silicone (Zetalabor) and conventional polymerization (Silwater), and (4) adding a second investment layer of silicone and polymerization with microwave energy (Silmicro). For each specimen, six segments of interdental distances (A to F) were measured to determine the artificial tooth positions in the waxed and polymerized stages using the software program AutoCad R14. The mean values of the changes were statistically compared by univariate ANOVA with Tukey post-hoc test at 5% significance. Results: There were no significant differences among the four groups, except for segment D of the Silmicro group (-0.004 +/- 0.032 cm) in relation to the Gypwater group (0.044 +/- 0.031 cm) (p < 0.05), which presented, respectively, expansion and shrinkage after polymerization. Conclusions: Within the limitations of this study, it was concluded that although the differences were generally not statistically significant, the use of a silicone investment layer when flasking complete dentures resulted in the least positional changes of the artificial teeth regardless of the polymerization technique.

Relevance: 100.00%

Abstract:

Purpose: The purpose of this in vitro study was to compare the dimensional accuracy of a stone index and of 3 impression techniques (tapered impression copings, squared impression copings, and squared impression copings splinted with acrylic resin) associated with 3 pouring techniques (conventional, pouring using latex tubes fitted onto analogs, and pouring after joining the analogs with acrylic resin) for implant-supported prostheses. Materials and Methods: A mandibular brass cast with 4 stainless steel implant-abutment analogs, a framework, and 2 aluminum custom trays were fabricated. Polyether impression material was used for all impressions. Ten groups were formed (a control group and 9 test groups formed by combining each pouring technique and impression technique). Five casts were made per group, for a total of 50 casts and 200 gap values (1 gap value for each implant-abutment analog). Results: The mean gap value with the index technique was 27.07 μm. With the conventional pouring technique, the mean gap values were 116.97 μm for the tapered group, 57.84 μm for the squared group, and 73.17 μm for the squared splinted group. With pouring using latex tubes, the mean gap values were 65.69 μm for the tapered group, 38.03 μm for the squared group, and 82.47 μm for the squared splinted group. With pouring after joining the analogs with acrylic resin, the mean gap values were 141.12 μm for the tapered group, 74.19 μm for the squared group, and 104.67 μm for the squared splinted group. No significant difference was detected among the index technique, the squared/latex-tube technique, and the master cast (P > .05). Conclusions: The most accurate impression technique utilized squared copings. The most accurate pouring technique for impressions made with tapered or squared copings utilized latex tubes. The pouring technique did not influence the accuracy of the stone casts when splinted squared impression copings were used. Either the index technique or squared copings combined with the latex-tube pouring technique is a preferred method for making implant-supported fixed restorations with dimensional accuracy.