891 results for Linear and multilinear programming
Abstract:
Trust is important in medical relationships and for the achievement of better health outcomes. Developments in managed care in recent years are believed to affect the quality of healthcare service delivery and to undermine trust in the healthcare provider. Physician choice has been identified as a strong predictor of provider trust but has not been studied in detail. Consumer satisfaction with primary care provider (PCP) choice includes having or not having physician choice. This dissertation developed a conceptual framework that guided the study of consumer satisfaction with PCP choice as a predictor of provider trust, and conducted secondary data analyses examining the association between PCP choice and trust by identifying factors related to PCP choice satisfaction and their relative importance in predicting provider trust. The study's specific aims were: (1) to determine variables related to the factors (consumer characteristics and health status; information and consumer decision-making; consumer trust in providers in general and trust in the insurer; health plan financing and plan characteristics; and provider characteristics) that may relate to PCP choice satisfaction; (2) to determine whether the factors in aim one are related to PCP choice satisfaction; and (3) to analyze the association between PCP choice satisfaction and provider trust, controlling for potential confounders. Analyses were based on secondary data from a 1999 random national telephone survey of residential households in the United States, which included respondents aged over 20 who had at least two visits with a health professional in the past two years. Among 1,117 eligible households interviewed (response rate 51.4%), the 564 respondents randomly selected to answer insurer-related questions made up the study sample.
Analyses using descriptive statistics and linear and logistic regressions found that continual effective care and interaction with the PCP beyond the medical setting were most predictive of PCP choice satisfaction. Four PCP choice satisfaction factors were also predictive of provider trust. The findings highlight the importance of the PCP's professional and interpersonal competencies for the development of sustainable provider trust. Future research on the access, utilization, cognition, and helpfulness of provider-specific information will further our understanding of consumer choice and trust.
New methods for quantification and analysis of quantitative real-time polymerase chain reaction data
Abstract:
Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The currently used methods for qPCR data analysis, including the threshold cycle (CT) method and linear and non-linear model-fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each pair of consecutive PCR cycles, we subtracted the fluorescence of the earlier cycle from that of the later cycle, transforming the n-cycle raw data into n-1 cycle data. Linear regression was then applied to the natural logarithm of the transformed data, and amplification efficiencies and initial DNA molecule numbers were calculated for each PCR run. To evaluate the new method, we compared it in terms of accuracy and precision with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum. Three criteria (threshold identification, max R2, and max slope) were employed to search for target data points. Considering that PCR data are time series data, we also applied linear mixed models. Collectively, when the threshold identification criterion was applied and the linear mixed model was adopted, the taking-difference linear regression method was superior, giving an accurate estimation of the initial DNA amount and a reasonable estimation of PCR amplification efficiencies. When the max R2 and max slope criteria were used, the original linear regression method gave an accurate estimation of the initial DNA amount. Overall, the taking-difference linear regression method avoids the error in subtracting an unknown background and is thus theoretically more accurate and reliable.
This method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
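The core of the taking-difference step can be sketched in a few lines. This is a minimal illustration assuming the usual exponential-phase model F_n = B + F0·E^n; the function and variable names are ours, not the authors':

```python
import numpy as np

def taking_difference_qpcr(fluorescence):
    """Estimate amplification efficiency E and initial signal F0 from raw
    qPCR fluorescence. Under the model F_n = B + F0 * E**n, consecutive
    differences D_n = F_{n+1} - F_n = F0 * (E - 1) * E**n cancel the
    unknown background B, so ln(D_n) is linear in the cycle number n."""
    f = np.asarray(fluorescence, dtype=float)
    d = np.diff(f)                      # n raw points -> n-1 differences
    cycles = np.arange(len(d))
    mask = d > 0                        # the log requires positive differences
    slope, intercept = np.polyfit(cycles[mask], np.log(d[mask]), 1)
    eff = np.exp(slope)                 # per-cycle amplification efficiency
    f0 = np.exp(intercept) / (eff - 1)  # initial fluorescence (proportional to DNA)
    return eff, f0
```

Because the background B drops out of the differences, no background-correction choice is needed before the regression.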
Abstract:
Monthly delta18O records of two coral colonies (Porites cf. lutea and P. cf. nodifera) from different localities (Aqaba and Eilat) in the northern Gulf of Aqaba, Red Sea, were calibrated against recorded sea surface temperatures (SST) between 1988 and 2000. The results show high correlation coefficients between SST and delta18O; seasonal variations of coral delta18O at both locations could explain 91% of the recorded SST. Different delta18O/SST relations were obtained between the two colonies and within the same colony, indicating that delta18O from coral skeletons was subject to an extension-rate effect. Significant delta18O depletions are associated with high extension rates, and higher values with low extension rates. The relation between coral skeletal delta18O and extension rate is not linear and can be described by a simple exponential model. An inverse relationship extends over extension rates from 1 to 5 mm/yr, while for more rapidly growing corals and portions of colonies the relation is constant and the extension rate does not appear to have a significant effect. We recommend that delta18O values be obtained from fast-growing corals or from portions in which the isotopic disequilibrium is fairly constant (extension rate >5 mm/yr). The results also show that interspecific differences may produce a significant delta18O profile offset between two colonies that is independent of environmental and extension-rate effects. We conclude that the rate of skeletal extension and the species of coral involved have an important influence on coral delta18O and must be considered when using delta18O records for paleoclimatic reconstructions.
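The abstract does not give the exact functional form of the exponential model, so the sketch below uses a hypothetical saturating exponential, d18O(g) = a + b·exp(-c·g), which is inverse at low extension rates g and flattens above roughly 5 mm/yr. The data are synthetic and purely illustrative, not the authors' measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def d18o_model(g, a, b, c):
    """Hypothetical saturating exponential: inverse relation at low
    extension rates g (mm/yr), asymptotically constant at high g."""
    return a + b * np.exp(-c * g)

# Illustrative (synthetic) extension rates and skeletal d18O values
g = np.linspace(1.0, 10.0, 20)
d18o = d18o_model(g, -4.0, 1.5, 0.8)

# Non-linear least squares recovers the model parameters (a, b, c)
params, _ = curve_fit(d18o_model, g, d18o, p0=(-4.0, 1.0, 1.0))
```

A form of this kind reproduces both regimes described in the abstract: a steep inverse relation below ~5 mm/yr and a near-constant relation for faster-growing skeleton.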
Abstract:
Hydrocarbons, sterols and alkenones were analyzed in samples collected from a 10 month sediment trap time series deployed in the Indian Ocean sector of the Southern Ocean. Fluxes and within-class distributions varied seasonally. During higher mass and organic carbon (OC) flux periods, which occurred in austral summer and fall, fresh marine inputs were predominant. Vertical fluxes were most intense in January, but limited to one week in duration. They were, however, low compared with other oceanic regions. In contrast, low mass and OC flux periods were characterized by a strong unresolved complex mixture (UCM) in the hydrocarbon fraction and a high proportion of stanols as a result of zooplanktonic grazing. Terrigenous inputs were not detectable. The alkenone compositions were consistent with previous data on suspended particles from Antarctic waters. However, UK'37 values diverged from the linear and exponential fits established by Sikes et al. (1997, doi:10.1016/S0016-7037(97)00017-3) in the low temperature range. The seasonal pattern of alkenone production implied that IPT (integrated production temperature) is likely to be strongly imprinted by austral summer and fall SST (sea surface temperature).
Abstract:
A reliable data set of Arctic sea ice concentration based on satellite observations has existed since 1972. Over this 36-year period, western Arctic temperatures have increased; the temperature rise varies significantly from one season to another and over multi-year time scales. In contrast to most of Alaska, however, on the North Slope the warming continued after 1976, when a circulation change occurred, as expressed in the PDO index. The mean temperature increase for Barrow over the 36-year period was 2.9°C, a very substantial change. Wind speeds increased by 18% over this time period; however, the increase was non-linear, peaking in the early 1990s. The sea ice extent of the Arctic Ocean has decreased strongly in recent years, and in September 2007 a new record in the amount of open water was recorded in the western Arctic. For the southern Beaufort Sea we observed a fairly steady increase in the mean annual amount of open water, from 14% in 1972 to 39% in 2007, as deduced from the best linear fit. In late summer the decrease is much larger: September has, on average, the least ice concentration (22%), followed by August (35%) and October (54%). The correlation coefficient between mean annual values of temperature and sea ice concentration was 0.84; on a monthly basis, the best correlation was found in October (0.88). However, the relationship between winter temperatures and the sea ice break-up in summer was weak. While temperature correlated well with the CO2 concentration (r=0.86), the correlation coefficient between CO2 and sea ice was lower (r=-0.68). After comparing the ice concentration with 17 circulation indices, the best relation was found with the Pacific Circulation Index (r=-0.59).
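The trend and correlation analysis described here amounts to a least-squares linear fit plus Pearson correlations. The sketch below shows the computation on synthetic stand-in series (the real Barrow and Beaufort Sea data are not reproduced):

```python
import numpy as np

# Synthetic stand-ins for the 1972-2007 annual series (not the real data)
years = np.arange(1972, 2008)
rng = np.random.default_rng(0)
open_water = np.linspace(14.0, 39.0, years.size) + rng.normal(0, 2, years.size)
temperature = np.linspace(-13.0, -10.1, years.size) + rng.normal(0, 0.5, years.size)

# Best linear fit to the mean annual open-water fraction (slope in %/yr)
slope, intercept = np.polyfit(years, open_water, 1)

# Pearson correlation between mean annual temperature and open water
r = np.corrcoef(temperature, open_water)[0, 1]
```

With the real series, the paper reports r = 0.84 between mean annual temperature and sea ice concentration.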
Abstract:
Anthropogenic CO2 emissions have exacerbated two environmental stressors, global climate warming and ocean acidification (OA), that have serious implications for marine ecosystems. Coral reefs are vulnerable to climate change yet few studies have explored the potential for interactive effects of warming temperature and OA on an important coral reef calcifier, crustose coralline algae (CCA). Coralline algae serve many important ecosystem functions on coral reefs and are one of the most sensitive organisms to ocean acidification. We investigated the effects of elevated pCO2 and temperature on calcification of Hydrolithon onkodes, an important species of reef-building coralline algae, and the subsequent effects on susceptibility to grazing by sea urchins. H. onkodes was exposed to a fully factorial combination of pCO2 (420, 530, 830 µatm) and temperature (26, 29 °C) treatments, and calcification was measured by the change in buoyant weight after 21 days of treatment exposure. Temperature and pCO2 had a significant interactive effect on net calcification of H. onkodes that was driven by the increased calcification response to moderately elevated pCO2. We demonstrate that the CCA calcification response was variable and non-linear, and that there was a trend for highest calcification at ambient temperature. H. onkodes then was exposed to grazing by the sea urchin Echinothrix diadema, and grazing was quantified by the change in CCA buoyant weight from grazing trials. E. diadema removed 60% more CaCO3 from H. onkodes grown at high temperature and high pCO2 than at ambient temperature and low pCO2. The increased susceptibility to grazing in the high pCO2 treatment is among the first evidence indicating the potential for cascading effects of OA and temperature on coral reef organisms and their ecological interactions.
Abstract:
A series of long-chain (C37, C38, C39), primarily di- and tri-unsaturated methyl and ethyl ketones, first identified in sediments from Walvis Ridge off West Africa and from the Black Sea (de Leeuw et al., 1979), has been found in marine sediments throughout the world (Brassell et al., 1986, doi:10.1038/320129a0). The marine coccolithophorid Emiliania huxleyi and members of the class Prymnesiophyceae are now the recognized sources of these compounds (Volkman et al., 1979; Marlowe et al., 1984). Experiments with laboratory cultures of algae showed that the degree of unsaturation in the ketone series biosynthesized depends on growth temperature (Brassell et al., 1986; Marlowe, 1984), a physiological response observed for classical membrane lipids (van Deenen et al., 1972). Brassell and co-workers (Brassell et al., 1986a; Brassell et al., 1986b) thus proposed that systematic fluctuations in the unsaturation of these alkenones, noted down-core in sediments from the Kane Gap region of the north-east tropical Atlantic Ocean and correlated with glacial-interglacial cycles, provide an organic geochemical measure of past sea-surface water temperatures. Using laboratory cultures of E. huxleyi, we have calibrated changes in the unsaturation pattern of the long-chain ketone series against growth temperature. The calibration curve is linear and accurately predicts unsaturation patterns observed in natural particulate materials collected from oceanic waters of known temperature. We present evidence supporting the proposed paleotemperature hypothesis (Brassell et al., 1986a; Brassell et al., 1986b) and suggesting that absolute 'sea-surface temperatures' for a given oceanic location can be estimated from an analysis of long-chain ketone compositions preserved in glacial and interglacial horizons of deep-sea sediment cores.
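The unsaturation index behind such calibrations is conventionally defined as UK'37 = [C37:2] / ([C37:2] + [C37:3]), and a linear calibration lets temperature be read back from measured ketone abundances. In the sketch below the calibration coefficients are illustrative, widely cited Prahl-type values, not necessarily those derived in this study:

```python
def uk37(c37_2, c37_3):
    """Alkenone unsaturation index UK'37 from the abundances of the
    di- and tri-unsaturated C37 methyl ketones."""
    return c37_2 / (c37_2 + c37_3)

def sst_from_uk37(index, slope=0.034, intercept=0.039):
    """Invert a linear culture calibration, UK'37 = slope * T + intercept,
    to estimate growth (sea-surface) temperature in deg C. The default
    coefficients are illustrative Prahl-type values, used here only as
    an example of a linear calibration."""
    return (index - intercept) / slope
```

Given relative abundances of the di- and tri-unsaturated ketones in a sediment horizon, these two lines yield an estimated paleotemperature for that horizon.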
Abstract:
A large-scale Chinese agricultural survey was conducted under the direction of John Lossing Buck from 1929 through 1933. At the end of the 1990s, parts of the original micro data of Buck’s survey were discovered at Nanjing Agricultural University, and an international joint study was begun to restore the micro data and construct parts of a micro database covering both the crop yield survey and the special expenditure survey. This paper summarizes the characteristics of farmlands and cropping patterns in the crop yield micro data, which covered 2,102 farmers in 20 counties of 9 provinces. To test the classical hypothesis of whether an inverse relationship between land productivity and cultivated area can be observed in developing countries, a Box-Cox transformation test of functional form was conducted for five main crops of Buck’s crop yield survey. The result shows that the relationship between land productivity and cultivated area is linear and somewhat negative for wheat and barley, while for rice, rapeseed, and seed cotton it appears slightly positive. It can be tentatively concluded that the relationship between cultivated area and land productivity is not the same among crops, and that differences in labor intensity and in the level of commercialization of each crop may be strongly related to the existence or non-existence of inverse relationships.
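A Box-Cox check of functional form along these lines can be sketched as follows, on synthetic, hypothetical yield/area data (a fitted lambda near 0 supports a logarithmic form and near 1 a linear one; this is an illustration, not the authors' estimation code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
area = rng.uniform(0.1, 3.0, 500)   # cultivated area (synthetic)
# Synthetic land productivity: positive-valued, mildly decreasing in area
productivity = np.exp(1.0 - 0.2 * np.log(area) + rng.normal(0.0, 0.3, 500))

# Box-Cox estimates the power transform that best normalizes the variable:
# lambda near 0 favours a log functional form, lambda near 1 a linear one
transformed, lmbda = stats.boxcox(productivity)

# The sign of the slope tests for an inverse productivity/area relationship
slope, intercept = np.polyfit(np.log(area), transformed, 1)
```

A crop-by-crop version of this fit is what allows the linear-and-negative (wheat, barley) versus slightly positive (rice, rapeseed, seed cotton) patterns to be distinguished.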
Abstract:
PIV and photographic recording are used to measure the velocity of the fresh gas and the shape of the reaction layer in a region around the tip of a methane-air Bunsen flame attached to a cylindrical burner. The results compare well with numerical simulations carried out with an infinite activation energy reaction model. The experimental and numerical results confirm that the well-known linear relation between flame velocity and flame stretch derived from asymptotic theory for weakly curved and strained flames is valid for small and moderate values of the flame stretch if the modified definition of stretch introduced by Echekki and Mungal (Proc Combust Inst 23:455–461, 1990) and Poinsot et al. (Combust Sci Technol 81:45–73, 1992) is used. However, the relation between flame velocity and modified stretch ceases to be linear and approaches a square root law for large values of the stretch, when the curvature of the flame tip becomes large compared to the inverse of the thickness of a planar flame.
Abstract:
With the rising price of retail electricity and the decreasing cost of PV technology, grid parity with commercial electricity will soon become a reality in Europe. This fact, together with less attractive PV feed-in tariffs in the near future and incentives to promote self-consumption, suggests that new operation modes for PV Distributed Generation should be explored, in contrast to the traditional approach, which is based only on maximizing the electricity exported to the grid. Smart metering is experiencing growth in Europe and the United States, but the possibilities for its use are still uncertain; in our system we propose using it to manage the storage and to allow users to know their electrical power and energy balances. Active Demand-Side Management (ADSM) has many previously studied benefits but also faces important challenges; in this paper we present an ADSM implementation example in which we propose a solution to these challenges. We study the effects of ADSM and storage systems on the amount of locally consumed electrical energy. The work was carried out on a prototype of a self-sufficient solar house called “MagicBox”, equipped with grid connection, PV generation, lead–acid batteries, controllable appliances, and smart metering. We performed simulations for long-term experiments (yearly studies) and real measurements for short- and mid-term experiments (daily and weekly studies). Results show the relationship between the electricity flows and the storage capacity, which is not linear and becomes an important design criterion.
Abstract:
Nondeterminism and partially instantiated data structures give logic programming expressive power beyond that of functional programming. However, functional programming often provides convenient syntactic features, such as having a designated implicit output argument, which allow function call nesting and sometimes result in more compact code. Functional programming also sometimes allows a more direct encoding of lazy evaluation, with its ability to deal with infinite data structures. We present a syntactic functional extension, used in the Ciao system, which can be implemented in ISO-standard Prolog systems and covers function application, predefined evaluable functors, functional definitions, quoting, and lazy evaluation. The extension is also composable with higher-order features and can be combined with other extensions to ISO-Prolog such as constraints. We also highlight the features of the Ciao system which help the implementation and present some data on the overhead of using lazy evaluation with respect to eager evaluation.
Abstract:
A number of data description languages initially designed as standards for the WWW are currently being used to implement user interfaces to programs. This is done independently of whether such programs are executed in the same host as the one running the user interface or in a different one. The advantage of this approach is that it provides a portable, standardized, and easy-to-use solution for the application programmer, and a familiar behavior for the user, typically well versed in the use of WWW browsers. Among the proposed standard description languages, VRML is aimed at representing three-dimensional scenes including hyperlink capabilities. VRML is already used as an import/export format in many 3-D packages and tools, and has been shown effective in displaying complex objects and scenarios. We propose and describe a Prolog library which allows parsing and checking VRML code, transforming it, and writing it out as VRML again. The library converts such code to an internal representation based on first-order terms which can then be arbitrarily manipulated. We also present as an example application the use of this library to implement a novel 3-D visualization for examining and understanding certain aspects of the behavior of CLP(FD) programs.
Abstract:
We present a parallel graph narrowing machine, which is used to implement a functional logic language on a shared-memory multiprocessor. It is an extension of an abstract machine for a purely functional language. The result is a programmed graph reduction machine which integrates the mechanisms of unification, backtracking, and independent and-parallelism. In this machine, the subexpressions of an expression can run in parallel. In the case of backtracking, the structure of an expression is used to avoid the reevaluation of subexpressions as far as possible. Deterministic computations are detected; their results are maintained and need not be reevaluated after backtracking.
Abstract:
This paper presents a technique for achieving a class of optimizations related to the reduction of checks within cycles. The technique uses both Program Transformation and Abstract Interpretation. After a first pass of an abstract interpreter which detects simple invariants, program transformation is used to build a hypothetical situation that simplifies some predicates that should be executed within the cycle. This transformation implements the heuristic hypothesis that once conditional tests hold they may continue doing so recursively. Specialized versions of predicates are generated to detect and exploit those cases in which the invariance may hold. Abstract interpretation is then used again to verify the truth of such hypotheses and confirm the proposed simplification. This allows optimizations that go beyond those possible with only one pass of the abstract interpreter over the original program, as is normally the case. It also allows selective program specialization using a standard abstract interpreter not specifically designed for this purpose, thus simplifying the design of this already complex module of the compiler. In the paper, a class of programs amenable to such optimization is presented, along with some examples and an evaluation of the proposed techniques in application areas such as floundering detection and reducing run-time tests in automatic logic program parallelization. The analysis of the examples presented has been performed automatically by an implementation of the technique using existing abstract interpretation and program transformation tools.
Abstract:
Most implementations of parallel logic programming rely on complex low-level machinery which is arguably difficult to implement and modify. We explore an alternative approach aimed at taming that complexity by raising core parts of the implementation to the source language level for the particular case of and-parallelism. We handle a significant portion of the parallel implementation mechanism at the Prolog level with the help of a comparatively small number of concurrency-related primitives which take care of lower-level tasks such as locking, thread management, stack set management, etc. The approach does not eliminate modifications to the abstract machine altogether, but it greatly simplifies them and facilitates experimenting with different alternatives. We show how this approach allows implementing both restricted and unrestricted (i.e., non-fork-join) parallelism. Preliminary experiments show that the amount of performance sacrificed is reasonable, although granularity control is required in some cases. We also observe that the availability of unrestricted parallelism contributes to better observed speedups.