991 results for dimension reduction


Relevance: 20.00%

Abstract:

The aim of the present article is to analyse the Apology in its temporal aspect. When defending himself against the charges, Socrates appeals to the past, the present and the future. Furthermore, the philosopher stresses the significance of the duration of time: he seems to suggest that all truly important activities require a long time to yield their benefit, since they are almost invariably connected with greater effort. While the dialogue thereby proves to be an ethical one, the various time expressions also gain an ethical dimension.

Relevance: 20.00%

Abstract:

To evaluate critical exposure levels and the reversibility of lead neurotoxicity, a group of lead-exposed foundry workers and an unexposed reference population were followed up for three years. During this period, tests designed to monitor neurobehavioural function and lead dose were administered. Evaluations of 160 workers during the first year showed dose-dependent decrements in mood, visual/motor performance, memory, and verbal concept formation. Subsequently, an improvement in the hygienic conditions at the plant resulted in striking reductions in blood lead concentrations over the following two years. Attendant improvement in indices of tension (20% reduction), anger (18%), depression (26%), fatigue (27%), and confusion (13%) was observed. Performance on neurobehavioural testing generally correlated best with integrated dose estimates derived from blood lead concentrations measured periodically over the study period; zinc protoporphyrin levels were less well correlated with function. This investigation confirms the importance of compliance with workplace standards designed to lower exposures so that individual blood lead concentrations remain below 50 micrograms/dl.
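The integrated dose estimates mentioned above are, in essence, the area under the blood-lead/time curve. A minimal sketch of one common way to compute such an index from periodic measurements, assuming trapezoidal integration and hypothetical sample values (the study's exact method is not given in the abstract):

```python
# Hypothetical illustration: a time-integrated blood lead index obtained by
# trapezoidal integration of periodic measurements. The study's actual
# integration method is not specified in the abstract.

def integrated_blood_lead(times_months, pbb_ug_dl):
    """Area under the blood-lead curve, in (micrograms/dl) x months."""
    area = 0.0
    for i in range(1, len(times_months)):
        dt = times_months[i] - times_months[i - 1]
        area += 0.5 * (pbb_ug_dl[i] + pbb_ug_dl[i - 1]) * dt
    return area

# Hypothetical worker: blood lead sampled every 6 months over 3 years,
# declining after the hygiene improvements at the plant.
times = [0, 6, 12, 18, 24, 30, 36]
pbb = [55, 52, 48, 40, 35, 33, 30]
print(integrated_blood_lead(times, pbb))  # cumulative dose index
```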

Relevance: 20.00%

Abstract:

Two new notions of reduction for terms of the λ-calculus are introduced, and the question of whether a λ-term is beta-strongly normalizing is reduced to the question of whether a λ-term is merely normalizing under one of the new notions of reduction. This leads to a new way to prove beta-strong normalization for typed λ-calculi. Instead of the usual semantic proof style based on Girard's "candidats de réductibilité", termination can be proved using a decreasing metric over a well-founded ordering, in a style more common in the field of term rewriting. This new proof method is applied to the simply-typed λ-calculus and to the system of intersection types.
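For orientation: a term is beta-strongly normalizing when every beta-reduction sequence from it terminates, whereas weak normalization only requires that some sequence reach a normal form. A minimal sketch of ordinary beta reduction on de Bruijn-indexed terms, for illustration only (the paper's two new notions of reduction are not reproduced here):

```python
# Minimal untyped lambda-calculus with leftmost-outermost beta reduction,
# using de Bruijn indices. Illustrative only: this is ordinary beta
# reduction, not the new reduction notions introduced in the paper.

from dataclasses import dataclass

@dataclass(frozen=True)
class Var: n: int                  # de Bruijn index
@dataclass(frozen=True)
class Lam: body: object            # \. body
@dataclass(frozen=True)
class App: f: object; a: object    # application

def shift(t, d, cutoff=0):
    """Add d to every free index >= cutoff."""
    if isinstance(t, Var):
        return Var(t.n + d) if t.n >= cutoff else t
    if isinstance(t, Lam):
        return Lam(shift(t.body, d, cutoff + 1))
    return App(shift(t.f, d, cutoff), shift(t.a, d, cutoff))

def subst(t, s, j=0):
    """Substitute s for index j in t, decrementing the freed indices."""
    if isinstance(t, Var):
        return s if t.n == j else (Var(t.n - 1) if t.n > j else t)
    if isinstance(t, Lam):
        return Lam(subst(t.body, shift(s, 1), j + 1))
    return App(subst(t.f, s, j), subst(t.a, s, j))

def step(t):
    """One leftmost-outermost beta step; None if t is in normal form."""
    if isinstance(t, App):
        if isinstance(t.f, Lam):
            return subst(t.f.body, t.a)          # contract the redex
        r = step(t.f)
        if r is not None:
            return App(r, t.a)
        r = step(t.a)
        return App(t.f, r) if r is not None else None
    if isinstance(t, Lam):
        r = step(t.body)
        return Lam(r) if r is not None else None
    return None

def normalize(t, fuel=1000):
    """Iterate step(); the fuel bound guards against non-normalizing terms."""
    while fuel:
        n = step(t)
        if n is None:
            return t
        t, fuel = n, fuel - 1
    raise RuntimeError("no normal form reached within the fuel bound")

# (\x. x) (\y. y)  -->  \y. y
print(normalize(App(Lam(Var(0)), Lam(Var(0)))))
```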

Relevance: 20.00%

Abstract:

A simple experiment is described that demonstrates nucleophilic addition to a carbonyl group. Sodium borohydride-mediated reduction of fluorenone is a fast and high-yielding reaction that is suitable for beginning students. Students isolate their fluorenol product by recrystallization and characterize it by NMR and IR spectroscopy.

Relevance: 20.00%

Abstract:

This is an addendum to our technical report BUCS TR-94-014 of December 19, 1994. It clarifies some statements, adds information on some related research, includes a comparison with research by de Groote, and fixes two minor mistakes in a proof.

Relevance: 20.00%

Abstract:

We define a unification problem ^UP with the property that, given a pure lambda-term M, we can derive an instance Gamma(M) of ^UP from M such that Gamma(M) has a solution if and only if M is beta-strongly normalizable. There is a type discipline for pure lambda-terms that characterizes beta-strong normalization; this is the system of intersection types (without a "top" type that can be assigned to every lambda-term). In this report, we use a lean version LAMBDA of the usual system of intersection types. Hence, ^UP is also an appropriate unification problem to characterize typability of lambda-terms in LAMBDA. It also follows that ^UP is an undecidable problem, which can in turn be related to semi-unification and second-order unification (both known to be undecidable).
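A standard illustration (not taken from the report) of why intersection types matter here: the self-application term is beta-strongly normalizable and receives an intersection type, although it has no simple type:

```latex
% Self-application is untypable with simple types but typable with an
% intersection type (standard example; notation may differ from LAMBDA):
\[
  \lambda x.\, x\, x \;:\; (\alpha \wedge (\alpha \to \beta)) \to \beta
\]
% The bound variable is used at both members of the intersection: once as
% a function of type \alpha \to \beta and once as its argument of type
% \alpha. Without a "top" type, every typable term is beta-strongly
% normalizable, which is the characterization the report relies on.
```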

Relevance: 20.00%

Abstract:

To provide real-time service or to engineer constraint-based paths, networks require the underlying routing algorithm to be able to find low-cost paths that satisfy given Quality-of-Service (QoS) constraints. However, the problem of constrained shortest (least-cost) path routing is known to be NP-hard, and heuristics have been proposed to find near-optimal solutions. These heuristics either impose relationships among the link metrics to reduce the complexity of the problem, which may limit their general applicability, or are too costly in terms of execution time to be applicable to large networks. In this paper, we focus on solving the delay-constrained minimum-cost path problem and present a fast algorithm to find a near-optimal solution. This algorithm, called DCCR (for Delay-Cost-Constrained Routing), is a variant of the k-shortest-path algorithm. DCCR uses a new adaptive path weight function, together with an additional constraint imposed on the path cost, to restrict the search space. Thus, DCCR can return a near-optimal solution in a very short time. Furthermore, we use the method proposed by Blokh and Gutin to further reduce the search space by applying a tighter bound on path cost. This makes our algorithm more accurate and even faster. We call this improved algorithm SSR+DCCR (for Search Space Reduction + DCCR). Through extensive simulations, we confirm that SSR+DCCR performs very well compared to the optimal but very expensive solution.
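The core idea behind heuristics of this kind can be sketched compactly: fold delay into a single additive weight w = cost + λ·delay, run Dijkstra on it, and accept the resulting path only if it meets the delay bound. This is a simplification under an assumed graph representation; DCCR's adaptive weight function and k-shortest-path search are not reproduced here.

```python
import heapq

def constrained_path(adj, src, dst, max_delay, lam=1.0):
    """Heuristic delay-constrained min-cost path: Dijkstra on the combined
    weight cost + lam*delay. Returns (path, cost, delay) or None. Not
    guaranteed optimal, or even feasible for a poor choice of lam; the
    real DCCR adapts the weight and explores k shortest paths instead.
    adj: {u: [(v, cost, delay), ...]}  (hypothetical representation)."""
    pq = [(0.0, 0.0, 0.0, src, [src])]   # (combined, cost, delay, node, path)
    done = set()
    while pq:
        w, c, d, u, path = heapq.heappop(pq)
        if u == dst:
            return (path, c, d) if d <= max_delay else None
        if u in done:
            continue
        done.add(u)
        for v, ec, ed in adj.get(u, []):
            if v not in done:
                heapq.heappush(pq, (w + ec + lam * ed, c + ec, d + ed,
                                    v, path + [v]))
    return None

# Toy network: the cheap path a-b-d violates the 4-unit delay bound, so a
# delay-weighted search (lam = 2) picks the pricier but feasible a-c-d.
net = {"a": [("b", 1, 5), ("c", 3, 1)], "b": [("d", 1, 5)],
       "c": [("d", 3, 1)], "d": []}
print(constrained_path(net, "a", "d", max_delay=4, lam=2.0))
```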

Relevance: 20.00%

Abstract:

Two classes of techniques have been developed to whiten the quantization noise in digital delta-sigma modulators (DDSMs): deterministic and stochastic. In this two-part paper, a design methodology for reduced-complexity DDSMs based on error masking is presented. Rules for selecting the word lengths of the stages in multistage architectures are given. We show that the hardware requirement can be reduced by up to 20% compared with a conventional design, without sacrificing performance. Simulation and experimental results confirm theoretical predictions. Part I addresses MultistAge noise SHaping (MASH) DDSMs; Part II focuses on single-quantizer DDSMs.
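As background to the multistage architecture, here is a minimal behavioural sketch of a MASH 1-1-1 DDSM, assuming three first-order n-bit accumulator stages whose carry-outs are recombined by the usual noise-cancellation network (the word lengths and input are illustrative and are not the paper's reduced-complexity design):

```python
def mash_111(x, n=16, steps=4096):
    """Behavioural MASH 1-1-1 DDSM. Each stage is an n-bit accumulator;
    its carry-out is the stage output and its residue drives the next
    stage. The noise-cancellation network recombines the carries as
    y = y1 + (1 - z^-1)*y2 + (1 - z^-1)^2 * y3 (third-order shaping).
    Requires 0 <= x < 2**n."""
    M = 1 << n                     # accumulator modulus
    s1 = s2 = s3 = 0               # accumulator states (residues)
    y2p = y3p = y3pp = 0           # delayed carries for the NCN
    out = []
    for _ in range(steps):
        s1 += x;  y1, s1 = divmod(s1, M)    # stage 1: carry, residue
        s2 += s1; y2, s2 = divmod(s2, M)    # stage 2 driven by residue 1
        s3 += s2; y3, s3 = divmod(s3, M)    # stage 3 driven by residue 2
        out.append(y1 + (y2 - y2p) + (y3 - 2 * y3p + y3pp))
        y2p, y3pp, y3p = y2, y3p, y3        # shift the delay line
    return out

# The average output converges to x / 2**n (here 0.25), as expected.
seq = mash_111(x=1 << 14, n=16)
print(sum(seq) / len(seq))
```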

Relevance: 20.00%

Abstract:

Selective isoelectric whey protein precipitation and aggregation is carried out at laboratory scale in a standard-configuration batch agitation vessel. Geometric scale-up of this operation is implemented on the basis of constant impeller power input per unit volume, and subsequent clarification is achieved by high-speed disc-stack centrifugation. Particle size and fractal geometry are important in achieving efficient separation, while aggregates need to be strong enough to resist the more extreme levels of shear encountered during processing, for example in pumps, valves and the centrifuge inlet zone. This study investigates how impeller agitation intensity and ageing time affect aggregate size, strength, fractal dimension and hindered settling rate at laboratory scale, in order to determine conditions conducive to improved separation. Particle strength is measured by observing the effects of subjecting aggregates to moderate and high levels of process shear, in a capillary rig and through a partially open ball-valve respectively. The protein precipitate yield is also investigated with respect to ageing time and impeller agitation intensity. A pilot-scale study is undertaken to investigate scale-up and how agitation vessel shear affects centrifugal separation efficiency. Laboratory-scale studies show that precipitates subjected to higher impeller shear rates during the addition of the precipitation agent are smaller but more compact than those subjected to lower impeller agitation, and are better able to resist turbulent breakage. They are thus more likely to provide a better feed for efficient centrifugal separation. Protein precipitation yield improves significantly with ageing, and 50 minutes of ageing is required to obtain a 70-80% yield of α-lactalbumin. Geometric scale-up of the agitation vessel at constant power per unit volume results in aggregates of broadly similar size exhibiting similar trends, but with some differences arising from the absence of dynamic similarity, namely a longer circulation time and a higher tip speed in the larger vessel. Disc-stack centrifuge clarification efficiency curves show that aggregates formed at higher shear rates separate more efficiently, in accordance with laboratory-scale projections. Exposure of aggregates to highly turbulent conditions, even for short exposure times, can lead to a large reduction in particle size. Improved separation efficiencies can thus be achieved by identifying high-shear zones in a centrifugal process and subsequently eliminating or ameliorating them.
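The scale-up consequences noted above, a longer circulation time and a higher tip speed at constant power per unit volume, follow from standard turbulent-agitation relations (textbook correlations, not derived in the abstract):

```latex
% Turbulent regime: P = N_p \rho N^3 D^5, and V \propto D^3 under
% geometric similarity, so constant power per unit volume gives
\[
  \frac{P}{V} \propto N^3 D^2 = \text{const}
  \quad\Longrightarrow\quad N \propto D^{-2/3},
\]
\[
  v_{\mathrm{tip}} = \pi N D \propto D^{1/3}, \qquad
  t_{\mathrm{circ}} \propto \frac{V}{N D^3} \propto \frac{1}{N}
  \propto D^{2/3}.
\]
% Hence the larger vessel runs at a higher tip speed and a longer
% circulation time: exactly the departures from dynamic similarity
% noted in the abstract.
```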

Relevance: 20.00%

Abstract:

Error-correcting codes are combinatorial objects designed to enable reliable transmission of digital data over noisy channels. They are used ubiquitously in communication, data storage, and elsewhere. Error correction allows reconstruction of the original data from the received word. Classical decoding algorithms are constrained to output just one codeword; however, in the late 1950s researchers proposed a relaxed error-correction model for potentially large error rates, known as list decoding. The research presented in this thesis focuses on reducing the computational effort and enhancing the efficiency of decoding algorithms for several codes, from both algorithmic and architectural standpoints. The codes in consideration are linear block codes closely related to Reed-Solomon (RS) codes. A high-speed, low-complexity algorithm and architecture are presented for encoding and decoding RS codes based on evaluation. The implementation results show that the hardware resources and the total execution time are significantly reduced compared with the classical decoder. The evaluation-based encoding and decoding schemes are modified and extended to shortened RS codes, and a software implementation shows a substantial reduction in memory footprint at the expense of latency. Hermitian codes can be seen as concatenated RS codes and are much longer than RS codes over the same alphabet. A fast, novel and efficient VLSI architecture for Hermitian codes is proposed based on interpolation decoding; the proposed architecture is shown to outperform Kötter's decoder for high-rate codes. The thesis also explores a method of constructing optimal codes by computing the subfield subcodes of Generalized Toric (GT) codes, a natural extension of RS codes to several dimensions. The polynomial generators, or evaluation polynomials, for subfield subcodes of GT codes are identified, from which the dimension and a bound on the minimum distance are computed. The algebraic structure of the polynomials evaluating into the subfield is used to simplify the list-decoding algorithm for BCH codes. Finally, an efficient and novel approach is proposed for exploiting powerful codes that have complex decoding but a simple encoding scheme (comparable to RS codes) in multihop wireless sensor network (WSN) applications.
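The evaluation view of RS codes that the encoder and decoder build on can be stated briefly: a length-k message defines a polynomial of degree less than k, and the codeword is its evaluation at n distinct field points. A minimal sketch over a small prime field for readability (practical RS codes work over GF(2^m), and the thesis's architecture is not reproduced here):

```python
P = 929  # small prime, so arithmetic mod P is a field; real RS uses GF(2^m)

def rs_encode_eval(msg, n, p=P):
    """Evaluation-based RS encoding: treat msg as the coefficients of a
    polynomial m(X) with deg m < k, and emit (m(0), m(1), ..., m(n-1)).
    Any k of the n symbols determine m(X) by interpolation, which is the
    property that interpolation-based (and list) decoders exploit."""
    assert len(msg) <= n <= p
    def m_at(x):                      # Horner evaluation of m(X) mod p
        acc = 0
        for c in reversed(msg):
            acc = (acc * x + c) % p
        return acc
    return [m_at(a) for a in range(n)]

# A (10, 5) code: 5 message symbols, 10 codeword symbols, up to 2 errors
# correctable by classical (unique) decoding.
print(rs_encode_eval([3, 1, 4, 1, 5], n=10))
```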

Relevance: 20.00%

Abstract:

This research aimed to investigate the main concern facing nurses in minimising risk within the perioperative setting, and to generate an explanatory substantive theory of how they resolve this through anticipatory vigilance. In the perioperative setting, nurses encounter challenges in minimising risks for their patients on a continuous basis. Current explanations of minimising risk in the perioperative setting offer insights into how perioperative nurses undertake their work, but research on minimising risk is broadly concerned with dealing with errors rather than preventing them. To date, little is known about how perioperative nurses practise and maintain safety. This study was guided by the principles of classic grounded theory as described by Glaser (1978, 1998, 2001). Data were collected through individual unstructured interviews with thirty-seven perioperative nurses (with varying lengths of experience of working in the area) and thirty-three hours of non-participant observation within eight different perioperative settings in the Republic of Ireland. Data were simultaneously collected and analysed. The theory of anticipatory vigilance emerged as the pattern of behaviour through which nurses deal with their main concern of minimising risk in a high-risk setting. Anticipatory vigilance is enacted through orchestrating, routinising and momentary adapting, within a spirit of trusting relations in the substantive area of the perioperative setting. This theory offers an explanation of how nurses resolve their main concern of minimising risk within the perioperative setting, and will be useful to nurses in providing a comprehensive framework for explaining and understanding how they deal with minimising risk. The theory links perioperative nursing, risk and vigilance together. Clinical improvements grounded in understanding and awareness of the theory of anticipatory vigilance will result in an improved-quality environment, leading to safe patient outcomes.

Relevance: 20.00%

Abstract:

This article will explore the contribution made to the construction of discourse around religion outside mainstream Christianity, at the turn of the twentieth century in Britain, by a Celticist movement as represented by Wellesley Tudor Pole (d. 1968) and his connection to the Glastonbury phenomenon. I will detail the interconnectedness of individuals and movements occupying this discursive space and their interest in efforts to verify the authenticity of an artefact which Tudor Pole claimed was once in the possession of Jesus. Engagement with Tudor Pole's quest to prove the provenance of the artefact, and his contention that a pre-Christian culture had existed in Ireland which had extended itself to Glastonbury and Iona, creating the foundation for an authentic Western mystical tradition, is presented as one facet of a broader contemporary discourse on alternative ideas and philosophies. In conclusion, I will juxtapose Tudor Pole's fascination with Celtic origins with the approach of leading figures in the 'Celtic Revival' in Ireland, suggesting intersections and alterity in the construction of their worldviews. The paper forms part of a chapter in a thesis in preparation which examines the construction of discourse on religion outside mainstream Christianity at the turn of the twentieth century, and in particular the role played by visiting religious reformers from Asia. The aim is to recover the (mostly forgotten) history of these engagements.

Relevance: 20.00%

Abstract:

Oxidation-reduction (redox) potential is a fundamental physicochemical parameter that affects the growth of microorganisms in dairy products and contributes to balanced flavour development in cheese. Even though redox potential has an important impact on the quality of dairy products, it is not usually monitored in the dairy industry. The aims of this thesis were to develop practical methods for measuring redox potential in cheese, to provide detailed information on changes in redox potential during cheesemaking and cheese ripening and on how this parameter is influenced by starter systems, and to understand the relationship between redox potential and cheese quality. Methods were developed for monitoring redox potential during cheesemaking and early in ripening. Changes in redox potential during laboratory-scale manufacture of Cheddar, Gouda, Emmental, and Camembert cheeses were determined. Distinctive kinetics of reduction in redox potential during cheesemaking were observed, depending on the cheese technology and starter culture utilised. Redox potential was also measured early in ripening by embedding electrodes into Cheddar cheese at moulding, together with the salted curd pieces; using this approach it was possible to monitor redox potential during the pressing stage. The redox potential of Emmental cheese was also monitored during ripening. Moreover, since bacterial growth drives the reduction in redox potential during cheese manufacture and ripening, the ability of Lactococcus lactis strains to affect redox potential was studied. The redox potential of a Cheddar cheese extract was altered by bacterial growth, and there were strain-specific differences in the nature of the redox potential/time curves obtained. In addition, strategies to control redox potential during cheesemaking and ripening were developed. Oxidizing or reducing agents were added to the salted curd before pressing, and the results confirmed that a negative redox potential is essential for the development of sulfur compounds in Cheddar cheese. Overall, the studies described in this thesis provide evidence of the importance of redox potential for the quality of dairy products. Redox potential could become an additional parameter used to select candidate starter microorganisms for fermented dairy products. Moreover, it has been demonstrated that redox potential influences the development of flavour components. Thus, continuously measuring changes in the redox potential of a product, and controlling and adjusting it if necessary during manufacture and ripening, could be important to the future of the dairy industry.

Relevance: 20.00%

Abstract:

AIM: To examine whether smokers who reduce their quantity of cigarettes smoked between two periods are more or less likely to quit subsequently. STUDY DESIGN: Data come from the Health and Retirement Study, a nationally representative survey of older Americans aged 51-61 in 1991, followed every 2 years from 1992 to 1998. The 2064 participants smoking at baseline and at the first follow-up comprise the main sample. MEASUREMENTS: Smoking cessation by 1996 is examined as the primary outcome; a secondary outcome is relapse by 1998. Spontaneous changes in smoking quantity between the first two waves make up the key predictor variables. Control variables include gender, age, education, race, marital status, alcohol use, psychiatric problems, acute or chronic health problems and smoking quantity. FINDINGS: Large (over 50%) and even moderate (25-50%) reductions in quantity smoked between 1992 and 1994 prospectively predict an increased likelihood of cessation in 1996 compared to no change in quantity (OR 2.96, P<0.001 and OR 1.61, P<0.01, respectively). Additionally, those who reduced and then quit were somewhat less likely to relapse by 1998 than those who did not reduce in the 2 years prior to quitting. CONCLUSIONS: Successfully reducing the quantity of cigarettes smoked appears to have a beneficial effect on future cessation likelihood, even after controlling for initial smoking level and other variables known to affect smoking cessation. These results indicate that the harm-reduction strategy of reduced smoking warrants further study.