949 results for high-level features


Relevance:

80.00%

Publisher:

Abstract:

Germany is one of the iodine-deficient countries in Europe. Marine fish and fish products can contribute considerably to the iodine supply via food. This short review gives an overview of the iodine content of fish and other marine species and discusses the influence of household cooking and other kitchen preparation. The iodine content of marine fish depends on the species and varies considerably, though at a high level. Lean fish species have an average iodine content of more than 100 μg iodine per 100 g edible portion. The recommended dietary allowance of 180-200 μg of iodine for adults can be covered by the consumption of one portion of marine fish per day.

Relevance:

80.00%

Publisher:

Abstract:

The use of synthetic and non-synthetic hormones has been reported in different regions, with different doses recommended. The adoption of these findings has, however, not been very successful because of the high cost of building and maintaining a hatchery, the high cost of synthetic hormone (when available) and the highly skilled manpower required. Adaptive research over the past ten years in developing countries such as Nigeria has therefore been geared towards the use of resources that are equally effective but cheap and readily available. This paper reports the use of the pituitary extract of the bullfrog (Rana adspersa) and the toad (Bufo regularis) in the induced breeding of the African catfish, Clarias gariepinus. The extraction and dosage are discussed alongside the preliminary rearing of fry in outdoor hatchery tanks. Human chorionic gonadotrophin (HCG) and Clarias pituitary extracts were used as controls.

Relevance:

80.00%

Publisher:

Abstract:

A five-month survey was conducted between August and December 1999 to identify the aquatic macrophytes in fishponds and reservoirs in Makurdi (Benue State, Nigeria). Three prominent aquatic macrophytes were identified at the two study sites: Ipomoea aquatica, Nymphaea lotus and Echinochloa pyramidalis (site 1 receives organic manure effluent from a cattle ranch; site 2 receives inorganic fertilizer by direct application). Ipomoea aquatica was restricted to site 1, while Nymphaea lotus and Echinochloa pyramidalis were associated with site 2. Analysis of the results indicates a higher level of ammonia-nitrogen at site 1 than at site 2. Mineral analysis of the plant tissues indicates high levels of iron in Ipomoea aquatica and Nymphaea lotus. Mineral concentrations were significantly higher (P < 0.05) in Ipomoea aquatica and Nymphaea lotus than in Echinochloa pyramidalis.

Relevance:

80.00%

Publisher:

Abstract:

Socioeconomic factors have long been incorporated into environmental research to examine the effects of human dimensions on coastal natural resources. Boyce (1994) proposed that inequality is a cause of environmental degradation, and the Environmental Kuznets Curve posits that rising income or GDP per capita is associated with initial increases in pollution followed by subsequent decreases (Torras and Boyce, 1998). To examine this relationship within the CAMA counties, emissions of sulfur dioxide and nitrogen oxides (measured by the EPA in tons emitted), the Gini coefficient, and income per capita were examined for the year 1999. A quadratic regression was used, and the results did not indicate that inequality, as measured by the Gini coefficient, was significantly related to the level of criteria air pollutants within each county. Nor did the results indicate the existence of an Environmental Kuznets Curve. Further analysis of spatial autocorrelation using ArcMap 9.2 found a high level of spatial autocorrelation among pollution emissions, indicating that a county's relation to neighboring counties may matter more for sulfur dioxide and nitrogen oxide emission levels than income per capita and inequality. Lastly, the paper recommends that further Environmental Kuznets Curve and income-inequality analyses of air pollutant levels incorporate spatial patterns as well as other explanatory variables. (PDF contains 4 pages)
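To make the analysis concrete, here is a minimal sketch of the kind of quadratic (Kuznets-type) regression described above, written in Python with statsmodels; the county data, variable names, and values are hypothetical placeholders, not the paper's data.

```python
# Hedged sketch of a quadratic (EKC-style) regression with an inequality term.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 20                                           # e.g., number of counties (hypothetical)
df = pd.DataFrame({
    "income": rng.uniform(20_000, 60_000, n),    # income per capita (illustrative)
    "gini": rng.uniform(0.35, 0.55, n),          # Gini coefficient (illustrative)
})
# hypothetical emissions (tons) with no built-in EKC relationship
df["so2_tons"] = rng.lognormal(mean=7, sigma=1, size=n)

X = sm.add_constant(pd.DataFrame({
    "income": df["income"],
    "income_sq": df["income"] ** 2,              # quadratic term tests the inverted-U shape
    "gini": df["gini"],
}))
model = sm.OLS(df["so2_tons"], X).fit()
print(model.summary())
# An EKC would appear as a significantly positive coefficient on income and a
# significantly negative one on income_sq; the paper reports finding neither.
```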

Relevance:

80.00%

Publisher:

Abstract:

The dissertation is concerned with the mathematical study of various network problems. First, three real-world networks are considered: (i) the human brain network, (ii) communication networks, and (iii) electric power networks. Although these networks perform very different tasks, they share similar mathematical foundations. The high-level goal is to analyze and/or synthesize each of these systems from a “control and optimization” point of view. After studying these three real-world networks, two abstract network problems are also explored, both motivated by power systems. The first is “flow optimization over a flow network” and the second is “nonlinear optimization over a generalized weighted graph”. The results derived in this dissertation are summarized below.

Brain Networks: Neuroimaging data reveals the coordinated activity of spatially distinct brain regions, which may be represented mathematically as a network of nodes (brain regions) and links (interdependencies). To obtain the brain connectivity network, the graphs associated with the correlation matrix and the inverse covariance matrix—describing marginal and conditional dependencies between brain regions—have been proposed in the literature. A question arises as to whether any of these graphs provides useful information about the brain connectivity. Due to the electrical properties of the brain, this problem will be investigated in the context of electrical circuits. First, we consider an electric circuit model and show that the inverse covariance matrix of the node voltages reveals the topology of the circuit. Second, we study the problem of finding the topology of the circuit based only on measurements. In this case, by assuming that the circuit is hidden inside a black box and only the nodal signals are available for measurement, the aim is to find the topology of the circuit when a limited number of samples are available. For this purpose, we deploy the graphical lasso technique to estimate a sparse inverse covariance matrix. It is shown that the graphical lasso may find most of the circuit topology if the exact covariance matrix is well-conditioned. However, it may fail to work well when this matrix is ill-conditioned. To deal with ill-conditioned matrices, we propose a small modification to the graphical lasso algorithm and demonstrate its performance. Finally, the technique developed in this work will be applied to the resting-state fMRI data of a number of healthy subjects.
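As a concrete illustration of this pipeline (synthetic data and an illustrative regularization weight, not the fMRI data or the modified algorithm developed in the thesis), the standard graphical lasso step might look like this in Python with scikit-learn:

```python
# Minimal sketch: recover a sparse conditional-dependence ("circuit-topology") graph
# from nodal signals with the graphical lasso.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Hypothetical sparse precision (inverse covariance) matrix for 5 "nodes";
# off-diagonal nonzeros play the role of circuit branches.
Theta = np.array([
    [ 2.0, -0.8,  0.0,  0.0,  0.0],
    [-0.8,  2.0, -0.8,  0.0,  0.0],
    [ 0.0, -0.8,  2.0, -0.8,  0.0],
    [ 0.0,  0.0, -0.8,  2.0, -0.8],
    [ 0.0,  0.0,  0.0, -0.8,  2.0],
])
Sigma = np.linalg.inv(Theta)

# "Measured" nodal signals: limited number of samples, as in the black-box setting.
X = rng.multivariate_normal(mean=np.zeros(5), cov=Sigma, size=200)

est = GraphicalLasso(alpha=0.05).fit(X)
recovered = np.abs(est.precision_) > 1e-2       # threshold small entries
print("estimated support of the precision matrix:\n", recovered.astype(int))
```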

Communication Networks: Congestion control techniques aim to adjust the transmission rates of competing users in the Internet in such a way that the network resources are shared efficiently. Despite the progress in the analysis and synthesis of Internet congestion control, almost all existing fluid models of congestion control assume that every link in the path of a flow observes the original source rate. To address this issue, a more accurate model is derived in this work for the behavior of the network under an arbitrary congestion controller, which takes into account the effect of buffering (queueing) on data flows. Using this model, it is proved that the well-known Internet congestion control algorithms may no longer be stable for the common pricing schemes, unless a sufficient condition is satisfied. It is also shown that these algorithms are guaranteed to be stable if a new pricing mechanism is used.
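For context, here is a minimal sketch of the classical fluid model that the passage says existing analyses rely on, in which every link is assumed to observe the original source rates (a Kelly-style primal rate-control law; the constants and the price function are illustrative, and the buffering effect added by the thesis is not modeled):

```python
# Euler simulation of a classical congestion-control fluid model:
# two sources share one bottleneck link; the link sees the raw source rates.
import numpy as np

C = 10.0                      # link capacity
w = np.array([1.0, 2.0])      # willingness-to-pay of the two sources
kappa = 0.1                   # controller gain
x = np.array([1.0, 1.0])      # initial source rates
dt = 0.01

def price(y):
    # link price as a function of the total arrival rate (illustrative penalty choice)
    return max(y - C, 0.0) / C

for _ in range(20_000):
    p = price(x.sum())                     # both flows traverse the single link
    x = x + dt * kappa * (w - x * p)       # dx_s/dt = kappa * (w_s - x_s * price)
    x = np.maximum(x, 0.0)

print("equilibrium rates:", x, "total:", x.sum())
# At equilibrium x_s * p = w_s, so once the link saturates the rates are shared
# in proportion to w_s (weighted proportional fairness).
```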

Electrical Power Networks: Optimal power flow (OPF) has been one of the most studied problems for power systems since its introduction by Carpentier in 1962. This problem is concerned with finding an optimal operating point of a power network minimizing the total power generation cost subject to network and physical constraints. It is well known that OPF is computationally hard to solve due to the nonlinear interrelation among the optimization variables. The objective is to identify a large class of networks over which every OPF problem can be solved in polynomial time. To this end, a convex relaxation is proposed, which solves the OPF problem exactly for every radial network and every meshed network with a sufficient number of phase shifters, provided power over-delivery is allowed. The concept of “power over-delivery” is equivalent to relaxing the power balance equations to inequality constraints.
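A toy sketch of the relaxation idea on a two-bus (hence radial) network, using cvxpy: the rank-one matrix V V^H is replaced by a Hermitian positive semidefinite variable W, and the load balance at the demand bus is relaxed to an inequality in the spirit of "power over-delivery". The network data are hypothetical and this generic formulation is not taken verbatim from the thesis.

```python
# Generic SDP relaxation of a toy 2-bus AC OPF (illustrative data).
import numpy as np
import cvxpy as cp

z = 0.02 + 0.08j                           # series impedance of the single line (p.u.)
y = 1 / z
Y = np.array([[y, -y], [-y, y]])           # bus admittance matrix

Pd, Qd = 0.8, 0.3                          # demand at bus 2 (p.u., hypothetical)
Vmin2, Vmax2 = 0.9**2, 1.1**2              # squared voltage-magnitude limits

W = cp.Variable((2, 2), hermitian=True)    # surrogate for V V^H

def injection(k):
    # complex power injected at bus k: S_k = e_k^T (V V^H) Y^H e_k
    e = np.zeros((2, 1)); e[k] = 1.0
    return (e.T @ W @ Y.conj().T @ e)[0, 0]

S1, S2 = injection(0), injection(1)

constraints = [
    W >> 0,                                # SDP relaxation of the rank-one condition
    cp.real(S2) <= -Pd,                    # deliver at least Pd ("over-delivery" allowed)
    cp.imag(S2) <= -Qd,
    Vmin2 <= cp.real(W[0, 0]), cp.real(W[0, 0]) <= Vmax2,
    Vmin2 <= cp.real(W[1, 1]), cp.real(W[1, 1]) <= Vmax2,
    cp.real(S1) >= 0, cp.real(S1) <= 2.0,  # generator limits (illustrative)
]

prob = cp.Problem(cp.Minimize(cp.real(S1)), constraints)   # linear generation cost
prob.solve()
print("generation (p.u.):", prob.value)
print("rank of W (exactness check):", np.linalg.matrix_rank(W.value, tol=1e-6))
```

When the relaxation is exact, the optimal W has rank one and a physically meaningful voltage vector can be recovered from it.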

Flow Networks: In this part of the dissertation, the minimum-cost flow problem over an arbitrary flow network is considered. In this problem, each node is associated with some possibly unknown injection, each line has two unknown flows at its ends related to each other via a nonlinear function, and all injections and flows need to satisfy certain box constraints. This problem, named generalized network flow (GNF), is highly non-convex due to its nonlinear equality constraints. Under the assumption of monotonicity and convexity of the flow and cost functions, a convex relaxation is proposed, which always finds the optimal injections. A primary application of this work is in the OPF problem. The results of this work on GNF prove that the relaxation on power balance equations (i.e., load over-delivery) is not needed in practice under a very mild angle assumption.

Generalized Weighted Graphs: Motivated by power optimizations, this part aims to find a global optimization technique for a nonlinear optimization defined over a generalized weighted graph. Every edge of this type of graph is associated with a weight set corresponding to the known parameters of the optimization (e.g., the coefficients). The motivation behind this problem is to investigate how the (hidden) structure of a given real/complex valued optimization makes the problem easy to solve, and indeed the generalized weighted graph is introduced to capture the structure of an optimization. Various sufficient conditions are derived, which relate the polynomial-time solvability of different classes of optimization problems to weak properties of the generalized weighted graph such as its topology and the sign definiteness of its weight sets. As an application, it is proved that a broad class of real and complex optimizations over power networks are polynomial-time solvable due to the passivity of transmission lines and transformers.

Relevance:

80.00%

Publisher:

Abstract:

Cyber-physical systems integrate computation, networking, and physical processes. Substantial research challenges exist in the design and verification of such large-scale, distributed sensing, actuation, and control systems. Rapidly improving technology and recent advances in control theory, networked systems, and computer science give us the opportunity to drastically improve our approach to integrated flow of information and cooperative behavior. Current systems rely on text-based specifications and manual design. Using new technology advances, we can create easier, more efficient, and cheaper ways of developing these control systems. This thesis will focus on design considerations for system topologies, ways to formally and automatically specify requirements, and methods to synthesize reactive control protocols, all within the context of an aircraft electric power system as a representative application area.

This thesis consists of three complementary parts: synthesis, specification, and design. The first section focuses on the synthesis of central and distributed reactive controllers for an aircraft electric power system. This approach incorporates methodologies from computer science and control. The resulting controllers are correct by construction with respect to system requirements, which are formulated using the specification language of linear temporal logic (LTL). The second section addresses how to formally specify requirements and introduces a domain-specific language for electric power systems. A software tool automatically converts high-level requirements into LTL and synthesizes a controller.
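As a hypothetical illustration (not taken from the thesis or its domain-specific language), requirements of this kind might be rendered in LTL as:

```latex
% Hypothetical electric-power-system requirements in LTL (illustrative names):
% "the essential bus is never unpowered at two consecutive steps"
\square\,\neg\bigl(\neg \mathit{bus\_powered} \wedge \mathbf{X}\,\neg \mathit{bus\_powered}\bigr)
% "whenever the left generator fails, the cross-tie contactor eventually closes"
\square\bigl(\mathit{gen\_L\_failed} \rightarrow \Diamond\,\mathit{contactor\_closed}\bigr)
```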

The final sections focus on design space exploration. A design methodology is proposed that uses mixed-integer linear programming to obtain candidate topologies, which are then used to synthesize controllers. The discrete-time control logic is then verified in real time by two methods: hardware and simulation. Finally, the problem of partial observability and dynamic state estimation is explored. Given a set placement of sensors on an electric power system, measurements from these sensors can be used in conjunction with control logic to infer the state of the system.

Relevance:

80.00%

Publisher:

Abstract:

Artisanal fishing societies are among the poorest in the developing world. Attempts to harness the potential of these societies have often failed because of the enormity of the problem of poverty. This study was conducted in four major fishing villages, namely Abule Titun, Apojola, Imama Odo and Ibaro, in order to investigate the occupational practices and problems of rural artisanal fisherfolk in the Oyam's Dam area of Ogun State. Eighty respondents were randomly selected from among the artisanal fisherfolk and interviewed using an interview guide. The findings revealed that 43.8% of the fisherfolk are within the active age range of 31-40 years, while 30% are within the 21-30 years range. Also, 31% had no formal education, indicating a relatively high level of illiteracy among the fisherfolk, while the majority of respondents fish using paddles and canoes. The study similarly found that the most pressing problem of the fisherfolk is the lack of basic social amenities such as electricity, potable water, access roads, hospitals and markets. It is therefore recommended that basic social infrastructure be provided for the artisanal fishing communities in order to improve their social welfare, standard of living and capacity to sustain the fishing occupation, in the interest of food security and poverty alleviation.

Relevance:

80.00%

Publisher:

Abstract:

Artisanal fishing societies are among the poorest in the developing world. Attempts to harness the potential of such societies have often failed because of the enormity of the problem of poverty. This study was conducted in four major fishing villages, namely Abule Titun, Apojola, Imala Odo and Ibaro, in order to investigate the occupational practices and problems of rural artisanal fisherfolk in the Oyam's Dam area of Ogun State. Eighty respondents were randomly selected from among the artisanal fisherfolk and interviewed using an interview guide. The findings revealed that 43.8% of the fisherfolk are within the active age range of 31-40 years, while 30% are within the 21-30 years range. Also, 31% had no formal education, indicating a relatively high level of illiteracy among the fisherfolk, while the majority of respondents fish using paddles and canoes. The study similarly found that the most pressing problem of the fisherfolk is the lack of basic social amenities such as electricity, potable water, access roads, hospitals and markets. It is therefore recommended that basic social infrastructure be provided for the artisanal fishing communities in order to improve their social welfare, standard of living and capacity to sustain the fishing occupation, in the interest of food security and poverty alleviation.

Relevance:

80.00%

Publisher:

Abstract:

ABSTRACT IN BASQUE (translated): Reading is a complex process that people must cultivate throughout their lives. Bearing in mind that the first contacts with reading begin in early childhood and within the family environment, this work aims to highlight the essential role of the family in promoting written language. Accordingly, we carried out a questionnaire-based study to examine the effort that families of 3-5-year-old children in Lekeitio make to foster a love of reading at home. The study yielded interesting conclusions; among others, that although parents report a habit of reading with their children at home, they make little effort to visit the library or attend reading initiatives, and that families do not place books at the top of their gift ranking.

Relevance:

80.00%

Publisher:

Abstract:

An analytical fluid model for J×B heating during normal incidence of a short, ultraintense, linearly polarized laser on a solid-density plasma is proposed. The steepening of an originally smooth electron density profile as the electrons are pushed inward by the laser is included self-consistently. It is shown that J×B heating involves two distinct coupling processes depending on the initial laser and plasma conditions: for moderate intensity (a <= 1), the ponderomotive force of the laser light can resonantly drive a large plasma wave at the point where n_e = 4γ_0 n_c. When this plasma wave is damped, the energy is transferred to the plasma. At higher intensity, the electron density is steepened by the time-independent ponderomotive force to a high level, n_e > 4γ_0 n_c, so that no 2ω resonance occurs, but the longitudinal component of the oscillating ponderomotive field can lead to an absorption mechanism similar to "vacuum heating." (c) 2006 American Institute of Physics.
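For reference, the standard argument behind the quoted resonance condition, in the abstract's notation (a textbook sketch, not the paper's full fluid model): with a linearly polarized field of normalized amplitude a(x,t) = a_0(x) cos ωt, the longitudinal ponderomotive force splits into a secular term and a term oscillating at 2ω,

```latex
F_p \;\propto\; -\frac{\partial}{\partial x}\Bigl[a_0^2(x)\cos^2\omega t\Bigr]
    \;=\; -\tfrac{1}{2}\,\frac{\partial a_0^2}{\partial x}\,\bigl(1+\cos 2\omega t\bigr),
\qquad
\omega_{pe}^2 = \frac{n_e e^2}{\varepsilon_0 \gamma_0 m_e},\quad
n_c = \frac{\varepsilon_0 m_e \omega^2}{e^2}
\;\;\Longrightarrow\;\;
2\omega = \omega_{pe} \iff n_e = 4\gamma_0 n_c .
```

The 2ω component therefore drives plasma oscillations resonantly exactly where n_e = 4γ_0 n_c; once the profile is steepened above this density the resonance is lost, as stated above.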

Relevance:

80.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to the estimated 650,000 to 900,000 Americans now infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and where treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database utilized was collected by myself from an outpatient HIV clinic. The data included dates from 1984 until the present. The second database was from the Multicenter AIDS Cohort Study (MACS) public dataset. The data from the MACS cover the time between 1984 and October 1992. Comparisons are made between both datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
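A minimal sketch of how such a distributional check might be run in Python with SciPy; the CD4 counts below are synthetic draws from a skewed distribution, not the clinic or MACS data.

```python
# Check whether CD4 T cell counts are plausibly Gaussian (synthetic illustration).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical CD4 counts (cells/mm^3) for an asymptomatic group: right-skewed.
cd4 = rng.gamma(shape=4.0, scale=120.0, size=300)

w_stat, p_norm = stats.shapiro(cd4)
print(f"Shapiro-Wilk on CD4:      W={w_stat:.3f}, p={p_norm:.2e}")   # small p => reject normality

# A log transform is one conventional response to this kind of skew; distributional
# assumptions must then be revisited for any downstream model that uses the marker.
w_log, p_log = stats.shapiro(np.log(cd4))
print(f"Shapiro-Wilk on log(CD4): W={w_log:.3f}, p={p_log:.2e}")
```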

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attacked a different region of HIV than reverse transcriptase inhibitors, when used in combination with other antiretroviral agents were found to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system as measured by CD4 T cell counts would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
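A minimal sketch of the kind of survival analysis described here, using the lifelines package on synthetic data; only the cohort size (213) and the endpoint definition are taken from the text.

```python
# Kaplan-Meier estimate of time to clinical failure (AIDS-defining illness),
# on synthetic durations standing in for the long-term HAART cohort.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
n = 213
durations = rng.exponential(scale=36.0, size=n)        # months on HAART (hypothetical)
observed = rng.random(n) < 0.6                         # True = reached endpoint, False = censored

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed, label="long-term HAART (synthetic)")

print(kmf.survival_function_.tail())                   # estimated S(t) at late times
print("median time to clinical failure (months):", kmf.median_survival_time_)
```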

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. This section first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated. It is estimated that the direct lifetime cost of treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable to the incremental cost per year of life saved from coronary artery bypass surgery.

Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.

Relevance:

80.00%

Publisher:

Abstract:

Modern robots are increasingly expected to function in uncertain and dynamically challenging environments, often in proximity with humans. In addition, wide-scale adoption of robots requires on-the-fly adaptability of software for diverse applications. These requirements strongly suggest the need to adopt formal representations of high-level goals and safety specifications, especially as temporal logic formulas. This approach allows the use of formal verification techniques for controller synthesis that can give guarantees for safety and performance. Robots operating in unstructured environments also face limited sensing capability. Correctly inferring a robot's progress toward a high-level goal can be challenging.

This thesis develops new algorithms for synthesizing discrete controllers in partially known environments under specifications represented as linear temporal logic (LTL) formulas. It is inspired by recent developments in finite abstraction techniques for hybrid systems and motion planning problems. The robot and its environment are assumed to have a finite abstraction as a Partially Observable Markov Decision Process (POMDP), which is a powerful model class capable of representing a wide variety of problems. However, synthesizing controllers that satisfy LTL goals over POMDPs is a challenging problem which has received only limited attention.

This thesis proposes tractable, approximate algorithms for the control synthesis problem using Finite State Controllers (FSCs). The use of FSCs to control finite POMDPs allows the closed system to be analyzed as a finite global Markov chain. The thesis explicitly shows how the transient and steady-state behavior of the global Markov chain can be related to two different criteria for the satisfaction of LTL formulas. First, maximization of the probability of LTL satisfaction is related to an optimization problem over a parametrization of the FSC. Analytic computation of gradients is derived, which allows the use of first-order optimization techniques.
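A small sketch of the closed-loop construction referred to here: composing an FSC with a finite POMDP yields a global Markov chain over joint (state, FSC-node) pairs. The model sizes and distributions below are random placeholders, not the thesis's systems.

```python
# Compose an FSC (action distribution psi, node-transition distribution eta)
# with a POMDP (transition T, observation O) into a global Markov chain.
import numpy as np

nS, nA, nO, nG = 3, 2, 2, 2       # states, actions, observations, FSC nodes

rng = np.random.default_rng(3)
def normalize(x, axis):
    return x / x.sum(axis=axis, keepdims=True)

T = normalize(rng.random((nS, nA, nS)), axis=2)    # T[s, a, s']   = Pr(s' | s, a)
O = normalize(rng.random((nS, nA, nO)), axis=2)    # O[s', a, o]   = Pr(o | s', a)
psi = normalize(rng.random((nG, nA)), axis=1)      # psi[g, a]     = Pr(a | g)
eta = normalize(rng.random((nG, nO, nG)), axis=2)  # eta[g, o, g'] = Pr(g' | g, o)

# P[(s,g) -> (s',g')] = sum_{a,o} psi(a|g) T(s'|s,a) O(o|s',a) eta(g'|g,o)
P = np.zeros((nS * nG, nS * nG))
for s in range(nS):
    for g in range(nG):
        for a in range(nA):
            for sp in range(nS):
                for o in range(nO):
                    for gp in range(nG):
                        P[s * nG + g, sp * nG + gp] += (
                            psi[g, a] * T[s, a, sp] * O[sp, a, o] * eta[g, o, gp]
                        )

assert np.allclose(P.sum(axis=1), 1.0)             # rows are probability distributions
# The transient and steady-state behavior of P (stationary distribution, recurrent
# classes) is what gets related to LTL satisfaction in the thesis.
```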

The second criterion encourages rapid and frequent visits to a restricted set of states over infinite executions. It is formulated as a constrained optimization problem with a discounted long term reward objective by the novel utilization of a fundamental equation for Markov chains - the Poisson equation. A new constrained policy iteration technique is proposed to solve the resulting dynamic program, which also provides a way to escape local maxima.
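For reference, the Poisson (fundamental) equation invoked here can be written, for an ergodic finite chain with transition matrix P, per-state reward vector r, long-run average reward η, and bias vector h, as

```latex
h + \eta\,\mathbf{1} \;=\; r + P\,h
\qquad\Longleftrightarrow\qquad
(I - P)\,h \;=\; r - \eta\,\mathbf{1},
```

with h unique up to an additive constant; the thesis embeds this relation in a constrained, discounted formulation that this note does not reproduce.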

The algorithms proposed in the thesis are applied to the task planning and execution challenges faced during the DARPA Autonomous Robotic Manipulation - Software challenge.

Relevance:

80.00%

Publisher:

Abstract:

[ES] The thermal characterization of a green facade is a difficult task that requires a degree of certainty and realistic model prediction under dynamic outdoor conditions. The theoretical study of complex construction elements does not reflect reality, so obtaining a correct characterization requires testing these elements and analyzing the resulting data. For this purpose, PASLINK test cells and the LORD software environment are used. Through these, the dynamic thermal transmittance of the green facade tested under real outdoor conditions is obtained.

Relevance:

80.00%

Publisher:

Abstract:

I. Foehn winds of southern California.
An investigation of the hot, dry and dust-laden winds occurring in the late fall and early winter in the Los Angeles Basin, attributed in the past to the influence of the desert regions to the north, revealed that these currents are of a foehn nature. Their properties were found to be entirely due to the dynamical heating produced in the descent from the high-level areas of the interior to the lower Los Angeles Basin. Any dust associated with the phenomenon was found to be acquired from the Los Angeles area rather than transported from the desert. It was found that the frequency of occurrence of a mild foehn of this type during this season is sufficient to warrant its classification as a winter monsoon. This follows from the topography of the Los Angeles region, which allows air from the interior easy entrance by way of the low-level mountain passes north of the area. This monsoon provides the mild winter climate of southern California, since temperatures associated with the foehn currents are far higher than those experienced when maritime air from the adjacent Pacific Ocean occupies the region.
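For a rough sense of the magnitude of this dynamical heating (the descent depth used here is illustrative, not taken from the text): air descending dry-adiabatically warms at g/c_p, about 9.8 K per kilometre, so a descent of roughly 1.5 km from the interior plateaus to the basin gives

```latex
\Delta T \;\approx\; \frac{g}{c_p}\,\Delta z
\;=\; \frac{9.81\ \mathrm{m\,s^{-2}}}{1004\ \mathrm{J\,kg^{-1}\,K^{-1}}}\times 1500\ \mathrm{m}
\;\approx\; 15\ \mathrm{K}.
```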

II. Foehn wind cyclo-genesis.
Intense anticyclones frequently build up over the high-level regions of the Great Basin and Columbia Plateau, which lie between the Sierra Nevada and Cascade Mountains to the west and the Rocky Mountains to the east. The outflow from these anticyclones produces extensive foehns east of the Rockies in the comparatively low-level areas of the Middle West and the Canadian provinces of Alberta and Saskatchewan. Normally at this season of the year very cold polar continental (Pc) air masses are present over this territory, and with the occurrence of these foehns marked discontinuity surfaces arise between the warm foehn current, which is obliged to slide over the colder mass, and the Pc air to the east. Cyclones are easily produced by this phenomenon and take the form of unstable waves which propagate along the discontinuity surface between the two dissimilar masses. A continual series of such cyclones was found to occur as long as the Great Basin anticyclone is maintained with undiminished intensity.

III. Weather conditions associated with the Akron disaster.
This situation illustrates the speedy development and propagation of young disturbances in the eastern United States during the spring of the year under the influence of the conditionally unstable tropical maritime air masses which characterise the region. It also furnishes an excellent example of the superiority of air mass and frontal methods of weather prediction for aircraft operation over the older methods based upon pressure distribution.

IV. The Los Angeles storm of December 30, 1933 to January 1, 1934.
This discussion points out some of the fundamental interactions occurring between air masses of the North Pacific Ocean in connection with Pacific Coast storms and the value of topographic and aerological considerations in predicting them. Estimates of rainfall intensity and duration from analyses of this type may be made and would prove very valuable in the Los Angeles area in connection with flood control problems.