957 results for "proposal to state"
Abstract:
Adult-onset urticaria pigmentosa/mastocytosis in the skin almost always persists throughout life. The prevalence of systemic mastocytosis in such patients is not precisely known. Bone marrow biopsies from 59 patients with mastocytosis in the skin and all available skin biopsies (n=27) were subjected to a meticulous cytological, histological, immunohistochemical, and molecular analysis for the presence of WHO-defined diagnostic criteria for systemic mastocytosis: compact mast cell infiltrates (major criterion); atypical mast cell morphology, KIT D816V, abnormal expression of CD25 by mast cells, and serum tryptase levels >20 ng/ml (minor criteria). Systemic mastocytosis is diagnosed when the major diagnostic criterion plus one minor criterion or at least three minor criteria are fulfilled. Systemic mastocytosis was confirmed in 57 patients (97%) by the diagnosis of compact mast cell infiltrates plus at least one minor diagnostic criterion (n=42, 71%) or at least three minor diagnostic criteria (n=15, 25%). In two patients, only two minor diagnostic criteria were detectable, insufficient for the diagnosis of systemic mastocytosis. By the use of highly sensitive molecular methods, including the analysis of microdissected mast cells, KIT D816V was found in all 58 bone marrow biopsies investigated for it but only in 74% (20/27) of the skin biopsies. It is important to state that even in cases with insufficient diagnostic criteria for systemic mastocytosis, KIT D816V-positive mast cells were detected in the bone marrow. This study demonstrates, for the first time, that almost all patients with adult-onset mastocytosis in the skin, in fact, have systemic mastocytosis with cutaneous involvement.
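The WHO decision rule stated above (major criterion plus at least one minor criterion, or at least three minor criteria) can be sketched as a small boolean check. The function and argument names are illustrative, not part of the WHO nomenclature; only the logic and the tryptase threshold follow the abstract.

```python
# Hedged sketch of the WHO diagnostic rule for systemic mastocytosis (SM)
# as summarized in the abstract; names are illustrative.

def diagnose_sm(compact_infiltrates: bool,   # major criterion
                atypical_morphology: bool,   # minor criteria below
                kit_d816v: bool,
                cd25_expression: bool,
                tryptase_ng_ml: float) -> bool:
    """Return True if the stated WHO criteria for SM are met."""
    minor = sum([atypical_morphology,
                 kit_d816v,
                 cd25_expression,
                 tryptase_ng_ml > 20.0])     # serum tryptase > 20 ng/ml
    return (compact_infiltrates and minor >= 1) or minor >= 3

print(diagnose_sm(True, False, True, False, 5.0))    # → True (major + 1 minor)
print(diagnose_sm(False, True, True, False, 25.0))   # → True (3 minor)
print(diagnose_sm(False, True, True, False, 5.0))    # → False (only 2 minor)
```

The third call mirrors the two patients in the study who met only two minor criteria, which is insufficient for the diagnosis.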
Abstract:
The Contested Floodplain tells the story of institutional change in the management of common-pool resources (pasture, wildlife, and fisheries) among Ila and Balundwe agro-pastoralists and Batwa fishermen in the Kafue Flats of southern Zambia. It explains how and why a once-rich floodplain area, managed under local common-property regimes, became a poor man’s place and a degraded resource area. Based on social-anthropological field research, the book shows how institutions that once worked well in regulating communal access to resources gave way to state property, open access, or privatization. The study traces developments from pre-colonial and colonial times up to today. Haller shows how the commons were well regulated by local institutions in the past, often embedded in religious belief systems. He then explains the transformation from common property to state property since colonial times. When the state is unable to provide well-functioning institutions because it lacks revenue, it contributes to de facto open access and degradation of the commons. The Zambian copper-based economy has been in crisis since 1975, and many Zambians have had to look for economic alternatives and find ways to profit from the lack of state control (a paradox of the present-absent state). While the state is absent, external actors use the ideology of citizenship to justify free use of resources in conflicts with local people. Within Zambian communities, too, floodplain resources are highly contested, as illustrated by conflicts over a proposed irrigation scheme in the area.
Abstract:
The European Commission’s proposals for the Legislative Framework of the Common Agricultural Policy (CAP) in the period 2014-2020 include, inter alia, the introduction of a “strong greening component”. For the first time, all EU farmers in receipt of support are to “go beyond the requirements of cross compliance and deliver environmental and climate benefits as part of their everyday activities”, with crop diversification as one such contribution. In a legal opinion prepared at the request of APRODEV, the Association of World Council of Churches related Development Organisations in Europe (www.aprodev.eu), Christian Häberli examines the WTO implications of this proposal, as compared with an alternative proposal to link direct payments to crop rotation instead. The conclusions are twofold: 1. Crop rotation is at least as likely as crop diversification to be found Green Box-compatible. Moreover, it will be more difficult to argue that crop diversification is “not more than minimally production-distorting”, because it entails less cost and work for most farmers. 2. Even if either of the two cropping schemes were to be found “amber”, the EU would not have to relinquish this conditionality. This is because the direct payments involved would in all likelihood not, together with the other price support instruments, exceed the amount available under the presently scheduled maximum.
Abstract:
We review the failure of lowest-order chiral SU(3)L × SU(3)R perturbation theory χPT3 to account for amplitudes involving the f0(500) resonance and O(mK) extrapolations in momenta. We summarize our proposal to replace χPT3 with a new effective theory χPTσ based on a low-energy expansion about an infrared fixed point in 3-flavour QCD. At the fixed point, the quark condensate ⟨q̅q⟩vac ≠ 0 induces nine Nambu-Goldstone bosons: π, K, η and a QCD dilaton σ which we identify with the f0(500) resonance. We discuss the construction of the χPTσ Lagrangian and its implications for meson phenomenology at low energies. Our main results include a simple explanation for the ΔI = 1/2 rule in K-decays and an estimate for the Drell-Yan ratio in the infrared limit.
Abstract:
R. G. Collingwood’s philosophical analysis of religious atonement as a dialectical process of mortal repentance and divine forgiveness is explained and criticized. Collingwood’s Christian concept of atonement, in which Christ is the Atonement (and also is the Incarnation), is subjected in turn to another kind of dialectic, in which some of Collingwood’s leading ideas are first surveyed and then tested against objections in a philosophical evaluation of their virtues and defects, strengths and weaknesses. Collingwood’s efforts to synthesize objective and subjective aspects of atonement, and his proposal to solve the soteriological problem of why God becomes flesh, a dogma of some Christian belief systems, are finally exposed in adversarial exposition as inadequately supported by one of his main arguments, designated here as Collingwood’s Dilemma. The dilemma is that sin is either forgiven or unforgiven by God. If God forgives sin, then God’s justice is lax, whereas if God does not forgive sin, then, also contrary to divine nature, God lacks perfect loving compassion. The dilemma is supposed to drive philosophy toward a concept of atonement in which the sacrifice of Christ is required in order to absolve God of the lax-judgment objection. God forgives sin only when the price of sin is paid, in this case by the suffering and crucifixion of God’s avatar. The dilemma can be resolved in another way than Collingwood considers, undermining his motivation for synthesizing objective and subjective facets of the concept of atonement for the sake of avoiding inconsistency. Collingwood is philosophically important because he asks all the right questions about religious atonement and points toward reasonable answers, even if he does not always deliver original, philosophically satisfactory solutions.
Abstract:
In this paper, we propose a new method for fully-automatic landmark detection and shape segmentation in X-ray images. To detect landmarks, we estimate the displacements from some randomly sampled image patches to the (unknown) landmark positions, and then integrate these predictions via a voting scheme. Our key contribution is a new algorithm for estimating these displacements. Different from other methods, where each image patch independently predicts its displacement, we jointly estimate the displacements from all patches together in a data-driven way, considering not only the training data but also geometric constraints on the test image. The displacement estimation is formulated as a convex optimization problem that can be solved efficiently. Finally, we use the sparse shape composition model as a priori information to regularize the landmark positions and thus generate the segmented shape contour. We validate our method on X-ray image datasets of three different anatomical structures: complete femur, proximal femur, and pelvis. Experiments show that our method is accurate and robust in landmark detection and, combined with the shape model, gives better or comparable performance in shape segmentation compared to state-of-the-art methods. Finally, a preliminary study using CT data shows the extensibility of our method to 3D data.
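The voting step described above can be sketched as follows: each sampled patch predicts a displacement to the (unknown) landmark, and the votes (patch position plus predicted displacement) are aggregated robustly. The paper's joint convex estimation of the displacements is not reproduced here, so the per-patch predictions are taken as given; all names and values are illustrative.

```python
import numpy as np

def vote_landmark(patch_centers: np.ndarray,
                  displacements: np.ndarray) -> np.ndarray:
    """Aggregate per-patch votes into a single landmark estimate.

    patch_centers: (n, 2) patch positions; displacements: (n, 2)
    predicted offsets from each patch to the landmark.
    """
    votes = patch_centers + displacements   # each patch casts one vote
    return np.median(votes, axis=0)         # robust aggregation of votes

# Hypothetical patches that all (noiselessly) point at the same landmark:
centers = np.array([[10., 10.], [40., 20.], [25., 35.]])
offsets = np.array([[20., 20.], [-10., 10.], [5., -5.]])
print(vote_landmark(centers, offsets))      # → [30. 30.]
```

A coordinate-wise median is one simple choice of aggregation; the abstract's voting scheme may differ in detail.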
Abstract:
This paper addresses the problem of fully-automatic localization and segmentation of 3D intervertebral discs (IVDs) from MR images. Our method contains two steps: we first localize the center of each IVD, and then segment the IVDs by classifying image pixels around each disc center as foreground (disc) or background. The disc localization is done by estimating the image displacements from a set of randomly sampled 3D image patches to the disc center. The image displacements are estimated by jointly optimizing the training and test displacement values in a data-driven way, taking into consideration both the training data and the geometric constraint on the test image. After the disc centers are localized, we segment the discs by classifying image pixels around the disc centers as background or foreground. The classification uses a data-driven approach similar to the one used for localization, but in the segmentation case we aim to estimate the foreground/background probability of each pixel instead of the image displacements. In addition, an extra neighborhood smoothness constraint is introduced to enforce the local smoothness of the label field. Our method is validated on 3D T2-weighted turbo spin echo MR images of 35 patients from two different studies. Experiments show that, compared to the state of the art, our method achieves better or comparable results. Specifically, we achieve a mean localization error of 1.6-2.0 mm, a mean Dice metric of 85%-88%, and a mean surface distance of 1.3-1.4 mm.
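The Dice metric used above to score segmentation quality (the abstract reports a mean Dice of 85%-88%) has a one-line definition; here is a minimal sketch for binary masks, with illustrative example data.

```python
import numpy as np

def dice(seg: np.ndarray, gt: np.ndarray) -> float:
    """Dice overlap 2|A ∩ B| / (|A| + |B|) for binary masks."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    denom = seg.sum() + gt.sum()
    return 2.0 * np.logical_and(seg, gt).sum() / denom if denom else 1.0

# Toy 1-D masks: 2 overlapping voxels, |A| = 3, |B| = 2 → 2*2/(3+2)
a = np.array([1, 1, 1, 0])
b = np.array([1, 1, 0, 0])
print(dice(a, b))  # → 0.8
```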
Abstract:
This study explores the relationships between forest cover change and the village resettlement and land planning policies implemented in Laos, which have led to the relocation of remote and dispersed populations into clustered villages with easier access to state services and market facilities. We used the Global Forest Cover Change (2000–2012) and the most recent Lao Agricultural Census (2011) datasets to assess forest cover change in resettled and non-resettled villages throughout the country. We also reviewed a set of six case studies and performed an original case study of 55 households in two villages of Luang Prabang province, inquiring about relocation, land losses, and intensification options. Our results show that resettled villages have greater baseline forest cover and total forest loss than most villages in Laos, but no significant forest loss relative to that baseline. Resettled villages are consistently associated with forested areas, minority groups, and intermediate accessibility. The case studies highlight that resettlement coupled with land use planning does not necessarily lead to the abandonment of shifting cultivation or affect forest loss, but rather leads to a re-spatialization of land use. This includes a clustering of forest clearings, which might lead to fallow shortening and land degradation, as only limited intensification options exist in the resettled villages. This study contributes to the study of the relationships between migration, forest cover change, livelihood strategies, land governance, and agricultural practices in tropical forest environments.
Abstract:
Gaussian random field (GRF) conditional simulation is a key ingredient in many spatial statistics problems for computing Monte-Carlo estimators and quantifying uncertainties on non-linear functionals of GRFs conditional on data. Conditional simulations are known to often be computationally intensive, especially when appealing to matrix decomposition approaches with a large number of simulation points. This work studies settings where conditioning observations are assimilated batch-sequentially, with one point or a batch of points at each stage. Assuming that conditional simulations have been performed at a previous stage, the goal is to take advantage of already available sample paths and by-products to produce updated conditional simulations at minimal cost. Explicit formulae are provided, which allow updating an ensemble of sample paths conditioned on n ≥ 0 observations to an ensemble conditioned on n + q observations, for arbitrary q ≥ 1. Compared to direct approaches, the proposed formulae prove to substantially reduce computational complexity. Moreover, these formulae explicitly exhibit how the q new observations update the old sample paths. Detailed complexity calculations highlighting the benefits of this approach with respect to state-of-the-art algorithms are provided and complemented by numerical experiments.
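The update idea can be illustrated in its simplest case: an ensemble of sample paths conditioned on n observations is moved to n + q observations by adding a kriging correction driven by the residuals at the new points, Z_{n+q}(x) = Z_n(x) + λ(x)ᵀ (y_new − Z_n(X_new)). The sketch below takes n = 0 (unconditional paths) and q = 2; the squared-exponential kernel, lengthscale, and all values are illustrative assumptions, and the paper's general update formulae and complexity analysis are not reproduced.

```python
import numpy as np

def k(u, v, ell=0.4):
    """Illustrative squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * (u[:, None] - v[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 21)                       # simulation grid
cov = k(x, x) + 1e-8 * np.eye(21)                   # jitter for sampling
paths = rng.multivariate_normal(np.zeros(21), cov, size=20)  # 20 paths, n = 0

x_new = np.array([0.2, 0.8])                        # q = 2 new observation sites
y_new = np.array([1.0, -0.5])
idx = [4, 16]                                       # grid indices of 0.2 and 0.8

lam = k(x, x_new) @ np.linalg.inv(k(x_new, x_new))  # kriging weights lambda(x)
paths += (y_new - paths[:, idx]) @ lam.T            # residual update, all paths at once

# Every updated path now interpolates the new observations:
print(np.allclose(paths[:, idx], y_new, atol=1e-6))  # → True
```

The correction is a single matrix product over the whole ensemble, which is what makes reusing previously simulated paths cheaper than re-simulating from scratch.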
On degeneracy and invariances of random fields paths with applications in Gaussian process modelling
Abstract:
We study pathwise invariances and degeneracies of random fields with motivating applications in Gaussian process modelling. The key idea is that a number of structural properties one may wish to impose a priori on functions boil down to degeneracy properties under well-chosen linear operators. We first show, in a second-order set-up, that almost sure degeneracy of random field paths under some class of linear operators defined in terms of signed measures can be controlled through the first two moments. A special focus is then put on the Gaussian case, where these results are revisited and extended to further linear operators thanks to state-of-the-art representations. Several degeneracy properties are tackled, including random fields with symmetric paths, centred paths, harmonic paths, or sparse paths. The proposed approach delivers a number of promising results and perspectives in Gaussian process modelling. In a first numerical experiment, it is shown that dedicated kernels can be used to infer an axis of symmetry. Our second numerical experiment deals with conditional simulations of a solution to the heat equation, and it is found that adapted kernels notably enable improved predictions of non-linear functionals of the field such as its maximum.
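One of the structural properties above, path symmetry about an axis x = a, can be sketched via the construction f(x) = g(x) + g(2a − x) with g a zero-mean Gaussian process: every realization of f is symmetric by construction. The squared-exponential kernel and a = 0.5 are illustrative assumptions, not the paper's setting.

```python
import numpy as np

def k(u, v, ell=0.3):
    """Illustrative squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * (u[:, None] - v[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 21)   # grid symmetric about a = 0.5
g = rng.multivariate_normal(np.zeros(21), k(x, x) + 1e-8 * np.eye(21))
f = g + g[::-1]                 # on this grid, g(2a - x) is just g reversed

print(np.allclose(f, f[::-1]))  # → True: the path is symmetric about 0.5
```

Equivalently, f is a centred Gaussian process with the symmetrized kernel k(x, x') + k(x, 2a − x') + k(2a − x, x') + k(2a − x, 2a − x'), i.e. a dedicated kernel of the kind the abstract uses to infer an axis of symmetry.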
Abstract:
This work deals with parallel optimization of expensive objective functions which are modelled as sample realizations of Gaussian processes. The study is formalized as a Bayesian optimization problem, or continuous multi-armed bandit problem, where a batch of q > 0 arms is pulled in parallel at each iteration. Several algorithms have been developed for choosing batches by trading off exploitation and exploration. As of today, the maximum Expected Improvement (EI) and Upper Confidence Bound (UCB) selection rules appear to be the most prominent approaches for batch selection. Here, we build upon recent work on the multipoint Expected Improvement criterion, for which an analytic expansion relying on Tallis’ formula was recently established. As the computational burden of this selection rule remains an issue in applications, we derive a closed-form expression for the gradient of the multipoint Expected Improvement, aimed at facilitating its maximization using gradient-based ascent algorithms. Substantial computational savings are shown in applications. In addition, our algorithms are tested numerically and compared to state-of-the-art UCB-based batch-sequential algorithms. Combining starting designs relying on UCB with gradient-based EI local optimization finally appears to be a sound option for batch design in distributed Gaussian process optimization.
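The paper's closed-form gradient concerns the multipoint EI; as a minimal one-point analogue (not the paper's formula), here is classical EI under the minimization convention, EI = s(uΦ(u) + φ(u)) with u = (f* − m)/s, together with its analytic derivatives in the predictive mean m and standard deviation s, checked against central finite differences. All numerical values are illustrative.

```python
from math import erf, exp, pi, sqrt

def Phi(u):  # standard normal cdf
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

def phi(u):  # standard normal pdf
    return exp(-0.5 * u * u) / sqrt(2.0 * pi)

def ei(m, s, f_best):
    """One-point Expected Improvement (minimization convention)."""
    u = (f_best - m) / s
    return s * (u * Phi(u) + phi(u))

def ei_grad(m, s, f_best):
    """Analytic derivatives: dEI/dm = -Phi(u), dEI/ds = phi(u)."""
    u = (f_best - m) / s
    return -Phi(u), phi(u)

m, s, f_best, h = 0.2, 0.7, 0.0, 1e-6
dm, ds = ei_grad(m, s, f_best)
fd_m = (ei(m + h, s, f_best) - ei(m - h, s, f_best)) / (2 * h)
fd_s = (ei(m, s + h, f_best) - ei(m, s - h, f_best)) / (2 * h)
print(abs(dm - fd_m) < 1e-5 and abs(ds - fd_s) < 1e-5)  # → True
```

Having such analytic gradients is what enables the gradient-based ascent the abstract advocates, in place of derivative-free maximization of the criterion.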
Abstract:
The 1971 ruling of the California Supreme Court in the case of Serrano v. Priest initiated a chain of events that abruptly ended local financing of public schools in California. In seven short years, California transformed its school finance system from a decentralized one in which local communities chose how much to spend on their schools to a centralized one in which the state legislature determines the expenditures of every school district. This paper begins by describing California's school finance system before Serrano and the transformation from local to state finance. It then delineates some consequences of that transformation and draws lessons from California's experience with school finance reform.
Abstract:
Background. Pulsed-field gel electrophoresis (PFGE) is a laboratory technique in which Salmonella DNA banding patterns are used as molecular fingerprints for epidemiologic study; groups of cases with indistinguishable patterns are termed "PFGE clusters". State and national health departments (CDC) use PFGE to detect clusters of related cases and to discover common sources of bacteria in outbreaks. Objectives. Using Houston Department of Health and Human Services (HDHHS) data, the study sought: (1) to describe the epidemiology of Salmonella in Houston, with PFGE subtype as a variable; and (2) to determine whether PFGE patterns and clusters detected in Houston were local appearances of PFGE patterns or clusters that occurred statewide. Methods. During the years 2002 to 2005, the HDHHS collected and analyzed data from routine surveillance of Salmonella. We implemented a protocol, between May 1, 2007 and December 31, 2007, in which PFGE patterns from local cases were sent via e-mail to the Texas Department of State Health Services (DSHS) to verify whether the local PFGE patterns were also part of statewide clusters. PFGE was performed on isolates from the 106 patients who provided a sample from which Salmonella was isolated in that period. Local PFGE clusters were investigated, with the enhanced picture obtained by linking local PFGE patterns to PFGE patterns at the state and national level. Results. We found that, during the years 2002 to 2005, there were 66 PFGE clusters, ranging in size from 2 to 22 patients per cluster. The sizes of PFGE clusters differed markedly between serotypes. A common source or risk factor was found in fewer than 5 of the 66 PFGE clusters. With the revised protocol, we found that 19 of 66 local PFGE patterns were indistinguishable from PFGE patterns at Texas DSHS. During the eight months, we identified ten local PFGE clusters with a total of 42 patients. The PFGE pattern for eight of the ten clusters matched the PFGE patterns for cases reported to Texas DSHS from other geographic areas. Five of the ten PFGE patterns matched PFGE patterns for clusters under investigation at PulseNet at the national level. HDHHS epidemiologists identified a mode of transmission in two of the ten local clusters and a common risk factor in a third local cluster. Conclusion. Under the extended-study protocol, Houston PFGE patterns were linked to patterns seen at the state and national level. The investigation of PFGE clusters was more effective in detecting a common source of transmission when local data were linked to state and national data.
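The clustering notion used above can be sketched as a simple grouping: isolates whose PFGE banding patterns are indistinguishable form a "PFGE cluster", counted once it contains at least two patients (the abstract's clusters range from 2 to 22 patients). The patterns and patient IDs below are hypothetical, not the study's data.

```python
from collections import defaultdict

def pfge_clusters(isolates, min_size=2):
    """Group patient IDs by PFGE pattern; keep groups of >= min_size."""
    groups = defaultdict(list)
    for patient, pattern in isolates:
        groups[pattern].append(patient)
    return {p: ids for p, ids in groups.items() if len(ids) >= min_size}

# Hypothetical surveillance records (patient ID, PFGE pattern designation):
isolates = [("pt01", "PAT-0004"), ("pt02", "PAT-0004"),
            ("pt03", "PAT-0021"), ("pt04", "PAT-0004")]
print(pfge_clusters(isolates))
# → {'PAT-0004': ['pt01', 'pt02', 'pt04']}
```

Linking local to statewide data, as the protocol did, amounts to matching these local pattern keys against the state and national pattern databases.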
Abstract:
Purpose: School districts in the U.S. regularly offer foods that compete with the USDA reimbursable meal, known as 'a la carte' foods. These foods must adhere to state nutritional regulations; however, the implementation of these regulations often differs across districts. The purpose of this study is to compare two methods of offering a la carte foods with respect to students' lunch intake: 1) an extensive a la carte program, in which schools have a separate area for a la carte food sales that includes non-reimbursable entrees; and 2) a moderate a la carte program, which offers a la carte foods on the same serving line as reimbursable meals. Methods: Direct observation was used to assess children's lunch consumption in six schools across two districts in Central Texas (n=373 observations). Schools were matched on socioeconomic status. Data collectors were randomly assigned to students and recorded foods obtained, foods consumed, source of food, gender, grade, and ethnicity. Observations were entered into a nutrient database program, FIAS Millennium Edition, to obtain nutritional information. Differences in energy and nutrient intake across lunch sources and districts were assessed using ANOVA and independent t-tests. A linear regression model was applied to control for potential confounders. Results: Students at schools with extensive a la carte programs consumed significantly more calories, carbohydrates, total fat, saturated fat, calcium, and sodium than students at schools with moderate a la carte offerings (p<.05). Students in the extensive a la carte program consumed approximately 94 more calories than students in the moderate a la carte program. There was no significant difference in energy consumption between students who consumed any amount of a la carte food and students who consumed none. In both districts, students who consumed a la carte offerings were more likely to consume sugar-sweetened beverages, sweets, chips, and pizza than students who consumed no a la carte foods. Conclusion: The amount, type, and method of offering a la carte foods can significantly affect student dietary intake. This pilot study indicates that when a la carte foods are more available, students consume more calories. The findings underscore the need for further investigation of how the availability of a la carte foods affects children's diets. Guidelines for school a la carte offerings should be strengthened to encourage the consumption of healthful foods and appropriate energy intake.
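The independent t-tests mentioned above compare mean intake between the two district groups; a minimal pooled-variance sketch follows. The per-student calorie values are hypothetical illustrations, not the study's data.

```python
from math import sqrt
from statistics import mean, variance

def t_statistic(a, b):
    """Pooled-variance two-sample t statistic for independent groups."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

extensive = [720, 680, 750, 710, 695]   # hypothetical per-student calories
moderate  = [640, 610, 655, 630, 600]
print(round(t_statistic(extensive, moderate), 2))  # → 5.42
```

In practice the resulting statistic is referred to a t distribution with na + nb − 2 degrees of freedom for the p-value, and the study additionally adjusted for confounders with a linear regression model.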
Abstract:
PURPOSE: In the United States, the percentage of Extremely Low Birth Weight (ELBW) births in 2006 was 0.8% (approximately 32,000 babies), and that of Very Low Birth Weight (VLBW) births was 1.48% (1). ELBW babies account for nearly half (49%) of infant mortality in the United States. Very low birth weight infants are at significant risk of high mortality and morbidity due to their multi-system involvement and predisposition to lung prematurity and impaired immune function. One commonly cited cause is Vitamin A deficiency (2, 3). The purpose of this study is to review the published literature on Vitamin A supplementation in very low birth weight (VLBW) infants. RESEARCH DESIGN: Systematic review of published articles meeting pre-defined criteria. PROCEDURE: Studies included in this review were those of very low birth weight infants, defined as birth weight < 1500 g. All experimental studies were reviewed, including those examining the effect of Vitamin A supplementation in comparison with a placebo or by itself in varying dosing regimens. Vitamin A deficiency and its manifestations were of interest. We used key words such as "very low birth weight", "mortality", "Vitamin A", "retinol" and "supplementation" in our search. RISKS & POTENTIAL BENEFITS: We do not see any potential risks associated with this study. The potential benefit is a recommendation for future studies based on the currently available literature. IMPORTANCE OF KNOWLEDGE THAT MAY REASONABLY BE EXPECTED TO RESULT: The systematic review of all experimental studies in VLBW infants showed a consistent correlation between parenteral Vitamin A dosing and the high plasma concentrations achieved. The recommended dosage is 5000 IU three times per week, given intramuscularly for 4 weeks, to prevent chronic lung disease (CLD). Higher doses have not shown benefit and carry a potential for toxicity, while lower doses are inadequate. Vitamin A has no role in the closure of patent ductus arteriosus or in reducing mortality. However, it is important to state that the number of studies done so far is limited, with small sample sizes. Experimental studies are needed in the future to ascertain the role of Vitamin A in improving outcomes in VLBW infants. At least one more RCT should be conducted using the dosage recommended above before this becomes standard practice.