868 results for Displaced homemakers


Abstract:

The frequency of morphological abnormalities in neuronal perikarya in contact with diffuse beta-amyloid (Abeta) deposits in patients with Alzheimer’s disease (AD) was compared with that of neurons located adjacent to the deposits. Morphological abnormalities were also studied in elderly, non-demented (ND) cases with and without diffuse Abeta deposits. In AD and ND cases with Abeta deposits, an increased proportion of neurons in contact with diffuse deposits exhibited at least one abnormality compared with neurons located adjacent to the deposits. Neurons in contact with diffuse deposits exhibited a greater frequency of abnormalities of shape, nuclei and Nissl substance, and had a higher frequency of cytoplasmic vacuoles, compared with adjacent neurons. A greater frequency of abnormalities of shape and Nissl substance, and of displaced nuclei, was also observed in neurons adjacent to diffuse deposits in AD compared with ND cases. With the exception of absent nuclei, morphological abnormalities adjacent to diffuse deposits in ND cases were similar to those of ND cases without Abeta deposits. These results suggest that neuronal degeneration is associated with the earliest stages of Abeta deposit formation and is not specifically related to the formation of mature senile plaques.

Abstract:

The literature relating to the performance of pulsed sieve plate liquid-liquid extraction columns and the relevant hydrodynamic phenomena has been surveyed. Hydrodynamic behaviour and mass transfer characteristics of droplets in turbulent and non-turbulent conditions have also been reviewed. Hydrodynamic behaviour, i.e. terminal and characteristic velocity of droplets, droplet size and droplet breakup processes, and mass transfer characteristics of single droplets (d ≤ 0.6 cm) were investigated under pulsed (mixer-settler and transitional regimes) and non-pulsed conditions in a 5.0 cm diameter, 100 cm high pulsed sieve plate column with three different sieve plate types and variable plate spacing. The system used was toluene (displaced) - acetone - distilled water. Existing photographic techniques for following and recording droplet behaviour, and for observing the parameters of the pulse and the pulse shape, were further developed and improved. A unique illumination technique was developed by which a moving droplet could be photographed using cine or video photography with good contrast, without using any dye. Droplet size from a given nozzle and droplet velocity for a given droplet diameter are reduced under pulsing conditions, and this effect is enhanced in the presence of a sieve plate. The droplet breakup processes are well explained by reference to an impact-breakup mechanism. New correlations to predict droplet diameter based on this mechanism are given, expressed in dimensionless groups as: (We)crit = 3.12 - 1.79 (Eo)crit. A correlation based on isotropic turbulence theory was developed to calculate droplet diameter in the emulsion regime. Experimental results show that in the mixer-settler and transitional regimes, pulsing parameters had little effect on the overall dispersed phase mass transfer coefficient during the droplet formation and unhindered travel periods.
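The critical-Weber correlation quoted in the abstract is a one-line calculation; the sketch below simply evaluates it. The function name and the sample Eötvös number are illustrative choices, not taken from the thesis:

```python
def critical_weber(eo_crit: float) -> float:
    """Critical Weber number from the reported correlation (We)crit = 3.12 - 1.79 (Eo)crit."""
    return 3.12 - 1.79 * eo_crit

# Illustrative (hypothetical) critical Eötvös number
print(round(critical_weber(1.0), 2))  # prints 1.33
```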

Abstract:

A review of the general chromatographic theory and of continuous chromatographic techniques has been carried out. Three methods of inversion of sucrose to glucose and fructose in beet molasses were explored: inversion using the enzyme invertase, the use of hydrochloric acid, and the use of the resin Amberlite IR118 in the H+ form. The preferred method on economic and purity considerations was the use of the enzyme invertase. The continuous chromatographic separation of inverted beet molasses, yielding a fructose-rich product and a product containing glucose and other non-sugars, was carried out using a semi-continuous counter-current chromatographic refiner (SCCR6), consisting of ten 10.8 cm x 75 cm long stainless steel columns packed with a calcium-charged 8% cross-linked polystyrene resin, Zerolit SRC 14. Based on the literature, this is the first time such a continuous separation has been attempted. It was found that the cations present in beet molasses displaced the calcium ions from the resin, resulting in poor separation of the glucose and fructose. Three methods of maintaining the calcium form of the resin during the continuous operation of the equipment were established. Passing a solution of calcium nitrate through the purge column for half a switch period was found to be most effective, as there was no contamination of the main fructose-rich product and the product concentrations were increased by 50%. When a 53% total solids (53 Brix) molasses feedstock was used, the throughput was 34.13 kg sugar solids per m3 of resin per hour. Product purities of 97% fructose in the fructose-rich (FRP) and 96% glucose in the glucose-rich (GRP) products were obtained, with product concentrations of 10.93% w/w for the FRP and 10.07% w/w for the GRP.
The effects of flowrates, temperature and background sugar concentration on the distribution coefficients of fructose, glucose, betaine and an ionic component of beet molasses were evaluated and general relationships derived. The computer simulation of inverted beet molasses separations on an SCCR system has been carried out successfully.

Abstract:

The thesis presents an experimentally validated modelling study of the flow of combustion air in an industrial radiant tube burner (RTB). The RTB is typically used in industrial heat treating furnaces. The work was initiated by the need for improvements in burner lifetime and performance, which are related to the fluid mechanics of the combusting flow, and a fundamental understanding of this is therefore necessary. To achieve this, a detailed three-dimensional Computational Fluid Dynamics (CFD) model has been used, validated with experimental air flow, temperature and flue gas measurements. Initially, the work programme is presented, together with the theory behind RTB design and operation and the theory behind swirling flows and methane combustion. NOx reduction techniques are discussed and numerical modelling of combusting flows is detailed in this section. The importance of turbulence, radiation and combustion modelling is highlighted, as well as the numerical schemes that incorporate discretization, finite volume theory and convergence. The study first focuses on the combustion air flow and its delivery to the combustion zone. An isothermal computational model was developed to allow examination of the characteristics of the flow as it enters the burner and progresses through the various sections prior to the discharge face in the combustion area. Important features identified include the air recuperator swirler coil, the step ring, the primary/secondary air splitting flame tube and the fuel nozzle. It was revealed that the effectiveness of the air recuperator swirler is significantly compromised by the need for a generous assembly tolerance. Also, there is a substantial circumferential flow maldistribution introduced by the swirler, but this is effectively removed by the positioning of a ring constriction in the downstream passage.
Computations using the k-ε turbulence model show good agreement with experimentally measured velocity profiles in the combustion zone, confirming the modelling strategy prior to the combustion study. Reasonable mesh independence was obtained with 200,000 nodes. Agreement was poorer with the RNG k-ε and Reynolds Stress models. The study continues to address the combustion process itself and the heat transfer process internal to the RTB. A series of combustion and radiation model configurations were developed, and the optimum combination of the Eddy Dissipation (ED) combustion model and the Discrete Transfer (DT) radiation model was used successfully to validate a burner experimental test. The previously cold-flow-validated k-ε turbulence model was used, and reasonable mesh independence was obtained with 300,000 nodes. The combination showed good agreement with temperature measurements in the inner and outer walls of the burner, as well as with flue gas composition measured at the exhaust. The inner tube wall temperature predictions matched the experimental measurements at most of the thermocouple locations, highlighting a small flame bias to one side, although the model slightly over-predicts the temperatures towards the downstream end of the inner tube. NOx emissions were initially over-predicted; however, the use of a combustion flame temperature limiting subroutine allowed convergence to the experimental value of 451 ppmv. With the validated model, the effectiveness of certain RTB features identified previously is analysed, and an analysis of the energy transfers throughout the burner is presented to identify the dominant mechanisms in each region. The optimum turbulence-combustion-radiation model selection was then the baseline for further model development. One of these models, an eccentrically positioned flame tube model, highlights the failure mode of the RTB during long-term operation.
Other models were developed to address NOx reduction and improvement of the flame profile in the burner combustion zone. These included a modified fuel nozzle design, with 12 circular-section fuel ports, which demonstrates a longer and more symmetric flame, although with limited success in NOx reduction. In addition, a zero-bypass swirler coil model was developed that highlights the effect of the stronger swirling combustion flow. Reduced-diameter and 20 mm forward-displaced flame tube models show limited success in NOx reduction, although the latter demonstrated improvements in the discharge face heat distribution and in the flame symmetry. Finally, Flue Gas Recirculation (FGR) modelling attempts indicate the difficulty of applying this NOx reduction technique in the Wellman RTB. Recommendations for further work are made that include design mitigations for the fuel nozzle, and further burner modelling is suggested to improve computational validation. The introduction of fuel staging is proposed, as well as a modification of the inner tube to enhance the effect of FGR.
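The mesh-independence checks reported above (200,000 nodes for the cold flow, 300,000 for the combusting case) are typically judged by the relative change of a monitored quantity between successive mesh refinements. The sketch below illustrates that check with hypothetical values, not data from the thesis:

```python
def relative_change(coarse: float, fine: float) -> float:
    """Fractional change in a monitored quantity between two mesh refinement levels."""
    return abs(fine - coarse) / abs(fine)

# Hypothetical peak velocities (m/s) on successively refined meshes
v_100k, v_200k, v_400k = 12.1, 12.6, 12.7

print(relative_change(v_100k, v_200k))  # still changing noticeably between levels
print(relative_change(v_200k, v_400k))  # under 1%: effectively mesh-independent
```

A solution is usually declared mesh-independent once this change falls below a preset tolerance (often around 1%) for the quantities of interest.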

Abstract:

The crystal structure of natural magnetite has been investigated on the basis of previously published X-ray intensity data and a newly acquired, more extensive data base. Both investigations show that the structure does not conform to the centrosymmetric space group Fd3m, as is normally assumed, but to the non-centrosymmetric space group F43m. The structure refinement provides values for the atom positions, anisotropic thermal parameters and bond lengths. A study of Friedel-related pairs of X-ray intensities shows that Friedel's law is violated in magnetite, further confirming that the space group is non-centrosymmetric. It was found that the octahedral site cations in magnetite do not occupy special positions at the centres of the octahedral interstices, as they should under the space group Fd3m, but are displaced along <111> directions, leading to F43m symmetry. A mechanism is proposed for the origin of these displacements, and the likelihood of similar displacements occurring in other natural and synthetic spinels is discussed. The crystal structure of a natural titanomaghemite was determined by a combination of X-ray diffraction and Mössbauer spectroscopy. This was confirmed as possessing a primitive cubic Bravais lattice with the space group P4332 and the structural formula Fe3+0.96 □0.04 [Fe2+0.23 Fe3+0.99 Ti4+0.42 □0.37] O4, where □ represents a cation vacancy. As the formula shows, there are cation vacancies on both tetrahedral and octahedral sites, the majority being restricted to octahedral sites. No tetrahedral site Fe2+ or Ti4+ was observed. Values for the atom positions, anisotropic thermal parameters and bond lengths have been determined for this particular specimen.
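As a quick consistency check on the structural formula above, the total cation charge per formula unit should balance the eight negative charges of O4. The sketch below verifies that arithmetic using the occupancies as published (agreement is within rounding of the reported values):

```python
# Site occupancies and formal charges from the titanomaghemite formula
tetrahedral = [(0.96, 3)]                        # Fe3+
octahedral = [(0.23, 2), (0.99, 3), (0.42, 4)]   # Fe2+, Fe3+, Ti4+

cation_charge = sum(n * q for n, q in tetrahedral + octahedral)
print(round(cation_charge, 2))  # prints 7.99, balancing O4 (charge -8) to within rounding
```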

Abstract:

It is known that parallel pathways exist within the visual system. These have been described as magnocellular and parvocellular as a result of the layered organisation of the lateral geniculate nucleus and extend from the retina to the cortex. Dopamine (DA) and acetylcholine (ACH) are neurotransmitters that are present in the visual pathway. DA is present in the retina and is associated with the interplexiform cells and horizontal cells. ACH is also present in the retina and is associated with displaced amacrine cells; it is also present in the superior colliculus. DA is found to be significantly depleted in the brain of Parkinson's disease (PD) patients and ACH in Alzheimer's disease (AD) patients. For this reason these diseases were used to assess the function of DA and ACH in the electrophysiology of the visual pathway. Experiments were conducted on young normals to design stimuli that would preferentially activate the magnocellular or parvocellular pathway. These stimuli were then used to evoke visual evoked potentials (VEP) in patients with PD and AD, in order to assess the function of DA and ACH in the visual pathway. Electroretinograms (ERGs) were also measured in PD patients to assess the role of DA in the retina. In addition, peripheral ACH function was assessed by measuring VEPs, ERGs and contrast sensitivity (CS) in young normals following the topical instillation of hyoscine hydrobromide (an anticholinergic drug). The results indicate that the magnocellular pathway can be divided into two: a cholinergic tectal-association area pathway carrying luminance information, and a non-cholinergic geniculo-cortical pathway carrying spatial information. It was also found that depletion of DA had very little effect on the VEPs or ERGs, confirming a general regulatory function for this neurotransmitter.

Abstract:

Mistuning a harmonic produces an exaggerated change in its pitch, a component-pitch shift. The origin of these pitch shifts was explored by manipulations intended to alter the grouping status of a mistuned target component in a periodic complex tone. In experiment 1, which used diotic presentation, reinstating the corresponding harmonic (in-tune counterpart) caused the pitch shifts on the mistuned target largely to disappear for components 3 and 4, although they remained for component 2. A computational model of component-pitch shifts, based on harmonic cancellation, was unable to explain the near-complete loss of pitch shifts when the counterpart was present; only small changes occurred. In experiment 2, the complex tone and mistuned component 4 were presented in the left ear and the in-tune counterpart was presented in the right. The in-tune counterpart again reduced component-pitch shifts, but they were restored when a captor complex into which the counterpart fitted as harmonic 3 was added in the right ear; presumably by providing an alternative grouping possibility for the counterpart. It is proposed that component-pitch shifts occur only if the mistuned component is selected to contribute to the complex-tone percept; these shifts are eliminated if it is displaced by a better candidate.

Abstract:

Government agencies use information technology extensively to collect business data for regulatory purposes. Data communication standards form part of the infrastructure with which businesses must conform to survive. We examine the development of, and emerging competition between, two open business reporting data standards adopted by government bodies in France; EDIFACT (incumbent) and XBRL (challenger). The research explores whether an incumbent may be displaced in a setting in which the contention is unresolved. We apply Latour’s (1992) translation map to trace the enrolments and detours in the battle. We find that regulators play an important role as allies in the development of the standards. The antecedent networks in which the standards are located embed strong beliefs that become barriers to collaboration and fuel the battle. One of the key differentiating attitudes is whether speed is more important than legitimacy. The failure of collaboration encourages competition. The newness of XBRL’s technology just as regulators need to respond to an economic crisis and its adoption by French regulators not using EDIFACT create an opportunity for the challenger to make significant network gains over the longer term. ANT also highlights the importance of the preservation of key components of EDIFACT in ebXML.

Abstract:

In many of the Statnotes described in this series, the statistical tests assume the data are a random sample from a normal distribution. These Statnotes include most of the familiar statistical tests, such as the ‘t’ test, analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). Nevertheless, many variables exhibit a more or less ‘skewed’ distribution. A skewed distribution is asymmetrical, with the mean displaced towards the long tail: to the right in a positive skew and to the left in a negative skew. If the mean of the distribution is low, the degree of variation large, and values can only be positive, a positively skewed distribution is usually the result. Many distributions have potentially a low mean and high variance, including the abundance of bacterial species on plants, the latent period of an infectious disease, and the sensitivity of certain fungi to fungicides. These positively skewed distributions are often fitted successfully by a variant of the normal distribution called the log-normal distribution. This Statnote describes fitting the log-normal distribution with reference to two scenarios: (1) the frequency distribution of bacterial numbers isolated from cloths in a domestic environment, and (2) the sizes of lichenised ‘areolae’ growing on the hypothallus of Rhizocarpon geographicum (L.) DC.
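Fitting a log-normal distribution reduces to fitting a normal distribution on the log scale. The sketch below illustrates this with synthetic positively skewed data; the parameter values are illustrative, not the Statnote's bacterial counts:

```python
import math
import random

random.seed(0)
# Synthetic positively skewed sample (hypothetical parameters)
counts = [random.lognormvariate(2.0, 0.5) for _ in range(1000)]

# Fitting a log-normal: take logs, then estimate the normal parameters on the log scale
logs = [math.log(x) for x in counts]
mu_hat = sum(logs) / len(logs)
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / (len(logs) - 1))

mean = sum(counts) / len(counts)
median = sorted(counts)[len(counts) // 2]
print(mean > median)  # prints True: the mean is pulled towards the long right tail
```

The estimates `mu_hat` and `sigma_hat` recover the generating parameters (2.0 and 0.5) to within sampling error, and the sample mean exceeding the median confirms the positive skew described above.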

Abstract:

A study of the influence of macroscopic quenching stresses on long fatigue crack growth in an aluminium alloy-SiC composite has been made. Direct comparison between quenched plate, where high residual stresses are present, and quenched and stretched plate, where they have been eliminated, has highlighted their role in crack closure. Despite similar strength levels and identical crack growth mechanisms, the stretched composite displays faster crack growth rates over the complete range of ΔK, measured at R = 0.1, with the threshold being displaced to a lower nominal ΔK value. Closure levels are dependent upon crack length, but are greater in the unstretched composite, due to the effect of surface compressive stresses acting to close the crack tip. These result in lower values of ΔKeff in the unstretched material, explaining the slower crack growth rates. Effective ΔKth values are measured at 1.7 MPa√m, confirmed by constant Kmax testing. In the absence of residual stress, closure levels of approximately 2.5 MPa√m are measured, and this is attributed to a roughness mechanism.
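The closure argument above rests on the standard effective stress-intensity range: the crack only grows while it is open, so ΔKeff = Kmax − Kop whenever the opening (closure) level exceeds Kmin. The helper below is a generic sketch of that definition, using hypothetical values rather than the paper's measurements:

```python
def delta_k_eff(k_max: float, k_min: float, k_op: float) -> float:
    """Effective stress-intensity range: only the open portion of the cycle drives growth."""
    return k_max - max(k_min, k_op)

# Hypothetical cycle at R = 0.1 with a closure level above K_min (units: MPa*sqrt(m))
print(delta_k_eff(k_max=10.0, k_min=1.0, k_op=2.5))  # prints 7.5, versus a nominal range of 9.0
```

This is why higher closure in the unstretched composite lowers ΔKeff, and hence crack growth rate, at the same nominal ΔK.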

Abstract:

N-Heterocyclic carbene coated Au and Pd nanoparticles have been prepared by a ligand exchange reaction; although carbenes quantitatively displaced the thioether and phosphine ligands from the nanoparticle surface, the resultant nanoparticles spontaneously leached metal complexes and aggregated in solution. © 2009 The Royal Society of Chemistry and the Centre National de la Recherche Scientifique.

Abstract:

In response to a crime epidemic afflicting Latin America since the early 1990s, several countries in the region have resorted to using heavy-force police or military units to physically retake territories de facto controlled by non-State criminal or insurgent groups. After a period of territory control, the heavy forces hand law enforcement functions in the retaken territories to regular police officers, in the hope that the territories and their populations will remain under the control of the state. To varying degrees of intensity and consistency, Brazil, Colombia, Mexico, and Jamaica have adopted such policies since the mid-1990s. During such operations, governments need to pursue two interrelated objectives: to better establish the state’s physical presence, and to realign the allegiance of the population in those areas toward the state and away from the non-State criminal entities. From the perspective of law enforcement, such operations entail several critical decisions and junctures. One is whether or not to announce the force insertion in advance: the decision trades off the element of surprise and the ability to capture key leaders of the criminal organizations against the ability to minimize civilian casualties and force levels. Advance announcement, however, may allow criminals to go to ground and escape capture, so governments must decide whether they merely seek to displace criminal groups to other areas or to maximize their decapitation capacity. Intelligence flows rarely come from the population; often, rival criminal groups are the best source of intelligence. However, cooperation between the State and such groups that goes beyond using vetted intelligence provided by them, such as State tolerance for militias, compromises the rule-of-law integrity of the State and ultimately can eviscerate even public safety gains. Sustaining security after initial clearing operations is at times even more challenging than conducting the initial operations.
Although, unlike the heavy forces, traditional police forces, especially if designed as community police, have the capacity to develop the trust of the community and ultimately focus on crime prevention, developing such trust often takes a long time. To develop the community’s trust, regular police forces need to conduct frequent on-foot patrols with intensive nonthreatening interactions with the population and minimize the use of force. Moreover, sufficiently robust patrol units need to be placed in designated beats for a substantial amount of time, often at least a year. Establishing oversight mechanisms, including joint police-citizens’ boards, further facilitates building trust in the police among the community. After disruption of the established criminal order, street crime often rises significantly, and both the heavy-force and community-police units often struggle to contain it. The increase in street crime alienates the population of the retaken territory from the State, so developing a capacity to address street crime is critical. Moreover, the community police units tend to be vulnerable, especially initially, to efforts by displaced criminals to reoccupy the cleared territories. Losing a cleared territory back to criminal groups is extremely costly in terms of losing any established trust and the ability to recover it. Rather than operating on an a priori determined handover schedule, a careful assessment of the relative strength of regular police and criminal groups after clearing operations is likely to be a better guide for timing the handover from heavy forces to regular police units. Cleared territories often experience not only a peace dividend but also a peace deficit: a rise in new serious crime (in addition to street crime). Newly valuable land and other previously inaccessible resources can lead to land speculation and forced displacement, and various other forms of new crime can also rise significantly.
Community police forces often struggle to cope with such crime, especially as it is frequently linked to legal businesses. Such new crime often receives little to no attention in the design of operations to retake territories from criminal groups, but without an effective response to it, the public safety gains of the clearing operations can be lost altogether.

Abstract:

World War II profoundly impacted Florida. The military geography of the State is essential to an understanding of the war. The geostrategic concerns of place and space determined that Florida would become a statewide military base. Florida's attributes of place, such as climate and topography, determined its use as a military academy hosting over two million soldiers, nearly 15 percent of the GI Army, the largest force the US ever raised. One in eight Floridians went into uniform. Equally, Florida's space on the planet made it central to both defensive and offensive strategies. The Second World War was a war of movement, and Florida was a major jump-off point for US force projection world-wide, especially of air power. Florida's demography facilitated its use as a base camp for the assembly and engagement of this military power. In 1940, less than two percent of the US population lived in Florida, a quiet, barely populated backwater of the United States. But owing to its critical place and space, over the next few years it became a 65,000 square mile training ground, supply dump, and embarkation site vital to the US war effort. Because of its place astride some of the most important sea lanes in the Atlantic World, Florida was the scene of one of the few Western Hemisphere battles of the war. The militarization of Florida began long before Pearl Harbor. The pre-war buildup conformed to the US strategy of the war. The strategy of the US was then (and remains today) one of forward defense: harden the frontier, then take the battle to the enemy, rather than fight them in North America. The policy of "Europe First" focused the main US war effort on the defeat of Hitler's Germany, evaluated to be the most dangerous enemy. In Florida were established the military forces requiring the longest time to develop, and most needed to defeat the Axis.
Those were a naval aviation force for sea-borne hostilities, a heavy bombing force for reducing enemy industrial states, and an aerial logistics train for overseas supply of expeditionary campaigns. The unique Florida coastline made possible the seaborne invasion training demanded for US victory. The civilian population was employed assembling mass-produced first-generation container ships, while Florida hosted casualties, Prisoners-of-War, and transient personnel moving between the Atlantic and Pacific. By the end of hostilities and the lifting of the Unlimited Emergency, officially on December 31, 1946, Florida had become a transportation nexus. Florida accommodated a return of demobilized soldiers and a migration of displaced persons, and evolved into a modern veterans' colonia. It was instrumental in fashioning the modern US military, while remaining a center of the active National Defense establishment. Those are the themes of this work.

Abstract:

Large amounts of the greenhouse gas methane are released from the seabed to the water column, where it may be consumed by aerobic methanotrophic bacteria. This microbial filter is consequently the last marine sink for methane before its liberation to the atmosphere. The size and activity of methanotrophic communities, which determine the capacity of the water column methane filter, are thought to be mainly controlled by nutrient and redox dynamics, but little is known about the effects of ocean currents. Here, we report measurements of methanotrophic activity and biomass (CARD-FISH) at methane seeps west of Svalbard, and relate them to physical water mass properties (CTD) and modelled current dynamics. We show that cold bottom water containing a large number of aerobic methanotrophs was rapidly displaced by warmer water with a considerably smaller methanotrophic community. This water mass exchange, caused by short-term variations of the West Spitsbergen Current, constitutes a rapid oceanographic switch severely reducing methanotrophic activity in the water column. Strong and fluctuating currents are widespread oceanographic features common at many methane seep systems and are thus likely to globally affect methane oxidation in the ocean water column.

Abstract:

The major topographic features, or provinces, beyond the continental slope off the Atlantic coast of the United States are (1) Sohm Plain, (2) Hatteras Plain, (3) Nares Plain, (4) Blake Basin, (5) Blake Plateau-Bahama Banks, and (6) Bermuda Rise. The whole of the described area is commonly referred to as the North American Basin. This basin is bounded on the north by Newfoundland Ridge and on the south by Puerto Rico Trench. Topographic features of note within the basin are the divide and the area of depressions between Sohm and Hatteras Plains, the sharply crested Blake Ridge, and the Puerto Rico Ridge. Recently accumulated data on deep-sea cores has given good evidence that the silt and sand covering the abyssal plains are displaced continental sediments in a virtually quartz-free oceanic environment. These sediments were deposited on a primary volcanic bottom. The primary or volcanic bottom is characterized by abyssal hills and seamounts, and the sediment bottom is characterized by abyssal plains, which extend seaward from the continental margins. On the Blake Plateau, bottom photographs and dredge hauls in the axis of the stream show that locally sediment has been removed and the bottom is paved with crusts and nodules of manganese. Photographs and dredged samples from the outer part of the New England Seamount Chain and Caryn Peak also indicate extensive encrustations of manganese oxide, which acts as a binding agent in areas of ooze or other organic debris and thus helps to stabilize the bottom.