826 results for Effects-Based Approach to Operations
Abstract:
We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact with modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution, which in the case of a constant prior function m(ω) imprints only smoothness on the reconstructed spectrum. In addition, we are able to analytically integrate out the only relevant overall hyperparameter α in the prior, removing the necessity for Gaussian approximations found, e.g., in the Maximum Entropy Method (MEM). Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of P[ρ|D] in the full Nω ≫ Nτ dimensional search space. The method yields gradually improving reconstruction results as the quality of the supplied input data increases, without introducing the artificial peak structures often encountered in the MEM. To support these statements we present mock data analyses for the case of zero-width delta peaks and for more realistic scenarios, based on the perturbative Euclidean Wilson loop as well as the Wilson line correlator in Coulomb gauge.
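The core numerical step described above — finding the extremum of the posterior over a spectrum with far more frequency bins than data points — can be sketched in miniature. The kernel, mock spectrum, error model, and the simple squared-difference smoothness penalty below are illustrative stand-ins, not the paper's axiomatic prior:

```python
# Toy spectral reconstruction: fit rho(omega) to correlator data
# D(tau) = sum_w K(tau, w) rho(w) dw by minimizing chi^2/2 plus a
# crude smoothness penalty with a quasi-Newton (L-BFGS-B) optimizer.
import numpy as np
from scipy.optimize import minimize

tau = np.linspace(0.05, 1.0, 16)       # Euclidean times (N_tau = 16)
omega = np.linspace(0.1, 10.0, 80)     # frequencies (N_omega >> N_tau)
dw = omega[1] - omega[0]
K = np.exp(-np.outer(tau, omega))      # Laplace-like kernel (illustrative)

# Mock "true" spectrum: a single broad peak at omega = 3.
rho_true = np.exp(-0.5 * ((omega - 3.0) / 0.4) ** 2)
D = K @ rho_true * dw                  # noiseless mock correlator
sigma = 1e-2 * D                       # assumed relative errors

def neg_log_post(rho, alpha=1e-3):
    # Negative log-posterior: data term plus smoothness regulator.
    chi2 = np.sum(((K @ rho * dw - D) / sigma) ** 2)
    smooth = np.sum(np.diff(rho) ** 2)
    return 0.5 * chi2 + alpha * smooth

res = minimize(neg_log_post, np.ones_like(omega),
               method="L-BFGS-B", bounds=[(0, None)] * len(omega))
rho_rec = res.x                        # positive, smoothed reconstruction
```

In this toy setup the positivity bounds play the role of a prior constraint; the full method replaces the ad-hoc `alpha` with a hyperparameter that is integrated out analytically.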
Abstract:
Research demonstrates that social preferences are characterized by significant individual differences. An important question, often overlooked, is where these individual differences originate and what processes underlie them. In this paper, we outline the neural trait approach to uncovering sources of individual differences in social preferences, particularly as evidenced in economic games. We focus on two primary methods—resting-state electroencephalography and structural magnetic resonance imaging—used by researchers to quantify task-independent, brain-based characteristics that are stable over time. We review research that has employed these methods to investigate social preferences with an emphasis on a key psychological process in social decision-making; namely, self-control. We then highlight future opportunities for the neural trait approach in cutting-edge decision-making research. Finally, we explore the debate about self-control in social decision-making and the potential role neural trait research could play in this issue.
Abstract:
The field of animal syndromic surveillance (SyS) is growing, with many systems being developed worldwide. Now is an appropriate time to share ideas and lessons learned from early SyS design and implementation. Based on our practical experience in animal health SyS, with additions from the public health and animal health SyS literature, we put forward for discussion a 6-step approach to designing SyS systems for livestock and poultry. The first step is to formalise policy and surveillance goals that take stakeholder expectations into account and reflect priority issues (1). Next, it is important to reach consensus on national priority diseases and identify current surveillance gaps. The geographic, demographic, and temporal coverage of the system must be carefully assessed (2). A minimum dataset for SyS should be defined that includes the essential data needed to achieve all surveillance objectives while minimizing the amount of data collected. One can then compile an inventory of the available data sources and evaluate each using the criteria developed (3). A list of syndromes should then be produced for all data sources. Cases can be classified into syndrome classes and the data can be converted into time series (4). Based on the characteristics of the syndrome time series, the length of historic data available and the type of outbreaks the system must detect, different aberration detection algorithms can be tested (5). Finally, it is essential to develop a minimally acceptable response protocol for each statistical signal produced (6). Important outcomes of this pre-operational phase should include the building of a national network of experts and collective action and evaluation plans. While some of the more applied steps (4 and 5) currently receive consideration, decision makers and surveillance developers should put more emphasis on the earlier conceptual steps (1-3).
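Steps (4) and (5) above — classifying cases into a syndrome time series and testing an aberration-detection algorithm — can be sketched as follows. The syndrome label, window length, and threshold are illustrative choices for this sketch, not recommendations from the text:

```python
# Step 4: convert hypothetical classified case records into a daily
# count series; Step 5: flag aberrations with a moving-baseline rule
# (an EARS-C1-style mean + k*SD detector, one of many one might test).
from collections import Counter
import statistics

# Hypothetical records: (day, syndrome class); day 29 has injected excess.
records = [(d, "respiratory") for d in range(30)]
records += [(29, "respiratory")] * 8

counts = Counter(day for day, syn in records if syn == "respiratory")
series = [counts.get(day, 0) for day in range(30)]   # daily time series

def aberrant_days(series, window=7, k=3.0):
    """Flag day t when its count exceeds mean + k*SD of the prior window."""
    flags = []
    for t in range(window, len(series)):
        base = series[t - window:t]
        mu = statistics.mean(base)
        sd = statistics.pstdev(base) or 1.0   # floor to avoid zero SD
        if series[t] > mu + k * sd:
            flags.append(t)
    return flags

print(aberrant_days(series))   # -> [29]
```

Each statistical signal produced this way would then feed the response protocol of step (6).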
Abstract:
The majority of people who sustain hip fractures after a fall to the side would not have been identified using current screening techniques such as areal bone mineral density. Identifying them, however, is essential so that appropriate pharmacological or lifestyle interventions can be implemented. A protocol, demonstrated on a single specimen, is introduced, comprising the following components: in vitro biofidelic drop tower testing of a proximal femur; high-speed image analysis through digital image correlation; detailed accounting of the energy present during the drop tower test; organ-level finite element simulations of the drop tower test; and micro-level finite element simulations of critical volumes of interest in the trabecular bone. Fracture in the femoral specimen initiated in the superior part of the neck. The measured fracture load was 3760 N, compared to 4871 N predicted by the finite element analysis. Digital image correlation showed compressive surface strains as high as 7.1% prior to fracture. Voxel-level results were consistent with high-speed video data and helped identify hidden local structural weaknesses. We found, using a drop tower test protocol, that a femoral neck fracture can be created with a fall velocity and energy representative of a sideways fall from standing. Additionally, we found that the nested explicit finite element method used allowed us to identify local structural weaknesses associated with femur fracture initiation.
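The energy-accounting component lends itself to a back-of-envelope check: the kinetic energy delivered at impact follows from an effective mass and an impact velocity, and the equivalent drop height from that velocity. The numbers below are assumed round values for illustration, not the study's measured quantities:

```python
# Back-of-envelope energy accounting for a sideways-fall drop test.
# m_eff and v are illustrative assumptions, not the study's values.
m_eff = 25.0   # kg, assumed effective mass of the falling body segment
v = 3.0        # m/s, assumed impact velocity for a fall from standing
g = 9.81       # m/s^2

kinetic = 0.5 * m_eff * v ** 2   # J delivered to the specimen at impact
h_equiv = v ** 2 / (2 * g)       # m, equivalent free-fall drop height
```

A drop tower reproducing this event must supply comparable energy at a comparable velocity, which is what the protocol's detailed accounting verifies.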
Abstract:
Neolithic wetland sites in the Swiss Plateau provide an extraordinary database for the study of mobilities, entanglements and transformations in material culture. Based on dendrochronologically dated settlements between 3900 and 3500 BC, two regional pottery styles and their local variations are well known, Pfyn and Cortaillod. The vessels share the same habitus and were made of local raw materials. However, some vessels specific to other pottery styles are also present in the sites. By focusing on itineraries of vessels and shifts in pottery knowledge, their appropriation in different contexts and the resulting material entanglements, we want to approach the multiple regimes of mobility: At Lake Constance - known for Pfyn pottery - specific Michelsberg vessels like tulip beakers and lugged jars occur in small numbers. These travelling objects were produced with exogenous raw materials and transported to the sites from Southern Germany. At Concise (Lake Neuchâtel), besides the local Cortaillod pottery, the whole repertoire of NMB pottery, characteristic of Eastern France, was also produced. Further cases from the same space-time frame point to other regimes of mobility. In our two PhD projects we compare pottery practices - styles, techniques, raw materials - from over 20 key sites in the region. Based on Bourdieu's reflexive anthropology, we apply different qualitative and quantitative archaeological and archaeometrical methods, thus striving for a deeper understanding of habitus and the transformative potential of moving people, objects and ideas on local and regional scales and in related social contexts.
Abstract:
The paper revives a theoretical definition of party coherence as being composed of two basic elements, cohesion and factionalism, to propose and apply a novel empirical measure based on spin physics. The simultaneous analysis of both components using a single measurement concept is applied to data representing the political beliefs of candidates in the Swiss general elections of 2003 and 2007, proposing a connection between the coherence of the beliefs party members hold and the assessment of parties being at risk of splitting. We also compare our measure with established polarization measures and demonstrate its advantage with respect to multi-dimensional data that lack clear structure. Furthermore, we outline how our analysis supports the distinction between bottom-up and top-down mechanisms of party splitting. In this way, we are able to turn the intuition of coherence into a defined quantitative concept that, additionally, offers a methodological basis for comparative research on party coherence. Our work serves as an example of how a complex systems approach can provide a new perspective on a long-standing issue in political science.
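The paper's spin-physics measure is not reproduced here, but the underlying intuition — treating each member's belief profile as a "spin" and scoring how well the spins align — can be illustrated with a minimal analogue based on mean pairwise cosine similarity, where a cohesive party scores near 1 and a factionalised one near or below 0:

```python
# Minimal analogue (not the paper's measure): party coherence as the
# mean pairwise alignment of members' belief vectors.
import numpy as np

def coherence(beliefs):
    # beliefs: (n_members, n_issues) array of positions, e.g. in [-1, 1]
    unit = beliefs / np.linalg.norm(beliefs, axis=1, keepdims=True)
    sim = unit @ unit.T              # pairwise cosine similarities
    n = len(beliefs)
    return (sim.sum() - n) / (n * (n - 1))   # mean off-diagonal entry

# Three members with near-identical beliefs vs. a split party.
cohesive = np.array([[1.0, 0.9], [0.9, 1.0], [1.0, 1.0]])
factional = np.array([[1.0, 1.0], [-1.0, -1.0], [1.0, -1.0]])
```

A single scalar of this kind captures cohesion; capturing factionalism as well (clusters of mutually aligned but opposed blocs) is precisely what motivates the richer spin-model treatment in the paper.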
Abstract:
I report on language variation in the unresearched variety of English emerging on Kosrae, Federated States of Micronesia. English is spoken as the inter-island lingua franca throughout Micronesia and has been the official language of FSM since it gained independence in 1986, though the country retains close ties with the US through an economic “compact” agreement. I present here an analysis of a corpus of over 90 Kosraean English speakers, compiled during a three-month fieldwork trip to the island in the Western Pacific. The 45-minute sociolinguistically sensitive recordings are drawn from a corpus of old and young speakers with varying levels of education, occupations, and off-island experience. In the paper I analyse two variables. The first variable is the realisation of /h/, often subject to deletion in both L1 and L2 varieties of English. Such occurrences are commonly associated with Cockney English, but are also found in Caribbean English and the postcolonial English of Australia. For example: Male, 31: “yeah I build their house their local huts and they pay me”. /h/ deletion is frequent in Kosraean English but, perhaps expectedly, occurs slightly less among people with higher contact with American English through having spent longer periods off island. The second feature under scrutiny is the variable epenthesis of [h] to provide a consonantal onset to vowel-initial syllables. Male, 31: “that guy is really hold now”. This practice is also found beyond Kosraean English. Previous studies find h-epenthesis arising in L1 varieties including Newfoundland and Tristan da Cunha English, while similar manifestations are identified in Francophone L2 learners of English. My variationist statistical analysis has shown [h] insertion: to disproportionately occur intervocalically; to be constrained by both speaker gender and age (older males are much more likely to epenthesise [h] in their speech); and to be more likely in the onset of stressed as opposed to unstressed syllables.
In light of the findings of my analysis, I consider the relationship between h-deletion and h-epenthesis, the plausibility of hypercorrection as a motivation for the variation, and the potential influence of the substrate language, alongside sociolinguistic factors such as attitudes towards the US based on mobility. The analysis sheds light on the extent to which different varieties share this characteristic and their comparability in terms of linguistic constraints and attributes.
References:
Clarke, S. (2010). Newfoundland and Labrador English. Edinburgh: Edinburgh University Press.
Hackert, S. (2004). Urban Bahamian Creole: System and Variation. Varieties of English Around the World G32. Amsterdam: Benjamins.
Milroy, J. (1983). On the sociolinguistic history of h-dropping in English. In Current Topics in English Historical Linguistics. Odense: Odense University Press.
Abstract:
INTRODUCTION Daylight-mediated photodynamic therapy has been shown to be an effective therapy for actinic keratoses (AKs) and a simple and tolerable treatment procedure in three randomized Scandinavian studies and two recent Phase III randomized controlled studies in Australia and Europe. OBJECTIVES To establish consensus recommendations for the use of daylight photodynamic therapy (DL-PDT) using topical methyl aminolaevulinate (MAL) in European patients with AKs. METHODS The DL-PDT consensus recommendations were developed on behalf of the European Society for Photodynamic Therapy in Dermatology by a panel of 10 dermatologists from different European countries with experience in treating patients with AK using PDT. Consensus was developed based on a literature review and the experts' experience in the treatment of AK using DL-PDT. RESULTS The recommendations arising from this panel of experts provide general guidance on the use of DL-PDT as a dermatological procedure, with specific guidance regarding patient selection, therapeutic indications, when to treat, pre-treatment skin preparation, MAL application and daylight exposure for patients with AK in different countries of Europe. CONCLUSIONS This consensus recommendation provides a framework for physicians to perform DL-PDT with MAL cream while ensuring efficiency and safety in the treatment of patients with AK in different European countries.
Abstract:
Infiltration is dominantly gravity driven, and a viscous-flow approach was developed. Laminar film flow equilibrates gravity with the viscous force, and a constant flow velocity evolves during a period lasting 3/2 times the duration of a constant input rate, qS. Film thickness F and the specific contact area L of the film per unit soil volume are the key parameters. Sprinkler irrigation produced in situ time series of volumetric water contents, θ(z,t), as determined with TDR probes. The wetting-front velocity v and the time series of the mobile water content, w(z,t), were deduced from θ(z,t). In vitro steady flow in a core of saturated soil provided the volume flux density, q(z,t), and the flow velocity, v, as determined from a heat-front velocity. The F and L parameters of the in situ and the in vitro experiments were compared. The macropore-flow restriction states that, for a particular permeable medium, the specific contact area L must be independent of qS, i.e., dL/dqS = 0. If true, then the relationship qS ∝ v^(3/2) could scale a wide range of input rates, 0 ≤ qS ≤ Ksat (the saturated hydraulic conductivity), into a permeable medium, and kinematic-wave theory would become a versatile tool to deal with non-equilibrium flow. The viscous-flow approach is based on hydromechanical principles similar to Darcy's law, but it is currently not suited to deducing flow properties from specified individual spatial structures of permeable media.
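The qS ∝ v^(3/2) scaling follows from the standard laminar-film relations: for a gravity-driven film of thickness F, the mean velocity is v = gF²/(3ν) and the flux density is q = LgF³/(3ν), so at fixed L, q grows as v^(3/2). The quick numerical check below uses illustrative parameter values:

```python
# Numerical check of the film-flow scaling q ∝ v^(3/2) at fixed L.
# Parameter values are illustrative; symbols follow the abstract.
g = 9.81      # m/s^2, gravitational acceleration
nu = 1.0e-6   # m^2/s, kinematic viscosity of water
L = 50.0      # m^2/m^3, assumed specific contact area

def film(F):
    """Mean film velocity and volume flux density for film thickness F (m)."""
    v = g * F ** 2 / (3 * nu)          # film velocity
    q = L * g * F ** 3 / (3 * nu)      # volume flux density
    return v, q

v1, q1 = film(5e-6)
v2, q2 = film(1e-5)
ratio = (q2 / q1) / (v2 / v1) ** 1.5   # equals 1 when q scales as v^1.5
```

The macropore-flow restriction dL/dqS = 0 is exactly what makes this scaling usable across the whole range of input rates up to Ksat.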
Abstract:
Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem we analyze the current state of practice and propose an approach based on an extensible, declarative and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from its technical specification prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance checking results by assisting the user towards the full alignment of the implementation with respect to its architecture. In particular, we show the benefits of providing actionable results by introducing a technique which automatically selects the optimal repairing solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support truly diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for an explicit management of the involved validation tools.
The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
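As a hypothetical miniature of the idea, a tool-agnostic rule can be plain data that a small validator checks against a dependency graph extracted by whatever analysis tool is in use. The rule shape and layer names below are invented for this sketch and are far simpler than a real DSL:

```python
# Declarative conformance rules as data: "layer X must not depend on
# layer Y". A tiny validator checks them against extracted dependencies.
rules = [
    {"forbid": ("ui", "database")},    # UI must not touch the database
    {"forbid": ("domain", "ui")},      # domain layer must not know the UI
]

# Dependencies as some analysis tool might extract them: (from, to) layers.
dependencies = [("ui", "domain"), ("domain", "database"), ("ui", "database")]

def check(rules, dependencies):
    """Return every dependency that violates a forbid rule."""
    violations = []
    for rule in rules:
        forbidden = rule["forbid"]
        violations += [d for d in dependencies if d == forbidden]
    return violations

print(check(rules, dependencies))   # -> [('ui', 'database')]
```

Because the rules are data rather than tool configuration, the same rule set can be re-targeted at different validators, which is the decoupling the approach describes; producing actionable repair suggestions from such violations is the further step the abstract addresses.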
Abstract:
Objective. To evaluate the HEADS UP Virtual Molecular Biology Lab, a computer-based simulated laboratory designed to teach advanced high school biology students how to create a mouse model. Design. A randomized controlled design with forty-four students from two science magnet high schools in Mercedes, Texas was utilized to assess knowledge and skills of molecular laboratory procedures, attitudes towards science and computers as a learning tool, and usability of the program. Measurements. Data were collected using five paper-and-pencil questionnaires and an internal "lab notebook." Results. The Virtual Lab was found to significantly increase student knowledge over time (p<0.005) and with each use (p<0.001), as well as to positively affect attitudes towards computers (p<0.001) and skills (p<0.005). No significant differences were seen in science attitude scores. Conclusion. These results provide evidence that the HEADS UP Virtual Molecular Biology Lab is a potentially effective educational tool for high school molecular biology education.
Abstract:
Purpose. To examine the association between living in proximity to Toxics Release Inventory (TRI) facilities and the incidence of childhood cancer in the State of Texas. Design. This is a secondary data analysis utilizing the publicly available Toxics Release Inventory (TRI), maintained by the U.S. Environmental Protection Agency, which lists the facilities that release any of the 650 TRI chemicals. Total childhood cancer cases and the childhood cancer rate (ages 0-14 years) by county for the years 1995-2003 were taken from the Texas Cancer Registry, available on the Texas Department of State Health Services website. Setting. This study was limited to the child population of the State of Texas. Method. Analysis was done using Stata version 9 and SPSS version 15.0. SaTScan was used for geographical spatial clustering of childhood cancer cases based on county centroids, using the Poisson clustering algorithm, which adjusts for population density. Pictorial maps were created using MapInfo Professional version 8.0. Results. One hundred and twenty-five counties had no TRI facilities in their region, while 129 counties had at least one TRI facility. An increasing trend in the number of facilities and total disposal was observed, except for the highest category based on cancer rate quartiles. Linear regression using log transformation of the number of facilities and total disposal to predict cancer rates was computed; however, neither variable was found to be a significant predictor. Seven significant geographical spatial clusters of counties with high childhood cancer rates (p<0.05) were indicated. Binomial logistic regression, categorizing the cancer rate into two groups (≤150 and >150), indicated an odds ratio of 1.58 (CI 1.127-2.222) for the natural log of the number of facilities. Conclusion.
We have used a unique methodology combining GIS and spatial clustering techniques with existing statistical approaches to examine the association between living in proximity to TRI facilities and the incidence of childhood cancer in the State of Texas. Although a concrete association was not indicated, further studies examining specific TRI chemicals are required. This information can enable researchers and the public to identify potential concerns, gain a better understanding of potential risks, and work with industry and government to reduce toxic chemical use, disposal, or other releases and the risks associated with them. TRI data, in conjunction with other information, can be used as a starting point in evaluating exposures and risks.
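For readers interpreting the logistic-regression result above: because the predictor is the natural log of the number of facilities, the reported odds ratio of 1.58 per unit of ln(facilities) implies a fitted coefficient and a per-doubling effect, recovered by routine arithmetic (an illustration of the interpretation, not an additional result from the study):

```python
# Translating the reported odds ratio per unit ln(facilities) into the
# implied logistic coefficient and the effect of doubling facility count.
import math

odds_ratio = 1.58                 # reported OR per unit of ln(facilities)
beta = math.log(odds_ratio)       # implied logistic coefficient

# Doubling the facility count raises ln(facilities) by ln(2), so the
# odds of being in the >150 rate group multiply by exp(beta * ln(2)).
per_doubling = math.exp(beta * math.log(2))
```

So each doubling of the facility count is associated with roughly a 37% increase in the odds under this model, which is easier to communicate than the per-log-unit figure.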