888 results for "whether binding on non-associated third party payer"


Relevance:

100.00%

Publisher:

Abstract:

We address mid-level vision for the recognition of non-rigid objects. We align model and image using frame curves, which are object or "figure/ground" skeletons. Frame curves are computed, without discontinuities, using Curved Inertia Frames, a provably global scheme implemented on the Connection Machine, based on non-Cartesian networks, a definition of a curved axis of inertia, and a ridge detector. I present evidence against frame alignment in human perception. This suggests that frame curves have a role in figure/ground segregation and in fuzzy boundaries; that their outside/near/top/incoming regions are more salient; and that perception begins by setting a reference frame (prior to early vision) and proceeds by processing convex structures.
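For orientation, the classical straight axis of inertia, which the paper's Curved Inertia Frames generalise to curved axes, can be computed from a figure's second-order moments. A minimal sketch, not the paper's scheme; the function name and toy mask are illustrative only:

```python
import numpy as np

def axis_of_inertia(mask):
    # Principal (straight) axis of inertia of a binary figure, from the
    # second-moment (covariance) matrix of its pixel coordinates.
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)
    cov = np.cov((pts - centroid).T)                 # 2x2 second-moment matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    return centroid, eigvecs[:, np.argmax(eigvals)]  # axis of largest extent

# Toy example: an elongated diagonal blob.
mask = np.zeros((64, 64), dtype=bool)
for i in range(10, 50):
    mask[i, i - 2:i + 3] = True
centroid, direction = axis_of_inertia(mask)
print("centroid:", centroid, "axis direction:", direction)
```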

Relevance:

100.00%

Publisher:

Abstract:

Kibble, N, 'The Relevance and Admissibility of Prior Sexual History with the Defendant in Sexual Offence Cases' (2001) 32 Cambrian Law Review 27-63 (cited with approval by the House of Lords in R v A (No 2) [2002] 1 AC 45). RAE2008

Relevance:

100.00%

Publisher:

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Dental Medicine.

Relevance:

100.00%

Publisher:

Abstract:

The Google AdSense Program is a successful internet advertisement program in which Google places contextual adverts on third-party websites and shares the resulting revenue with each publisher. Advertisers have budgets and bid on ad slots, while publishers set reserve prices for the ad slots on their websites. Following previous modelling efforts, we model the program as a two-sided market with advertisers on one side and publishers on the other. We show a reduction from the Generalised Assignment Problem (GAP) to the problem of computing the revenue-maximising allocation and pricing of publisher slots under a first-price auction. GAP is APX-hard, but a (1-1/e) approximation is known. We compute truthful and revenue-maximising prices and allocation of ad slots to advertisers under a second-price auction. The auctioneer's revenue is within a (1-1/e) factor of the optimal second-price revenue.
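For context, the truthful second-price rule with a publisher reserve that the abstract builds on can be sketched for a single ad slot. This is only a toy illustration of the pricing rule, not the paper's allocation algorithm (which also handles advertiser budgets and many slots); the function name and bids are hypothetical:

```python
def second_price_winner(bids, reserve):
    # Single-slot second-price (Vickrey) auction with a publisher reserve:
    # the highest eligible bidder wins and pays the larger of the
    # second-highest eligible bid and the reserve price.
    eligible = {a: b for a, b in bids.items() if b >= reserve}
    if not eligible:
        return None, None                       # reserve not met: slot unsold
    ranked = sorted(eligible.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else reserve
    return winner, max(price, reserve)

print(second_price_winner({"ad1": 3.0, "ad2": 5.0, "ad3": 1.0}, reserve=2.0))
# ('ad2', 3.0): ad2 wins and pays the second-highest eligible bid
```

Charging the second-highest bid rather than the winner's own bid is what makes truthful bidding a dominant strategy, which is why the abstract's truthful prices are tied to a second-price auction.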

Relevance:

100.00%

Publisher:

Abstract:

There has been considerable work done in the study of Web reference streams: sequences of requests for Web objects. In particular, many studies have looked at the locality properties of such streams, because of the impact of locality on the design and performance of caching and prefetching systems. However, a general framework for understanding why reference streams exhibit given locality properties has not yet emerged. In this work we take a first step in this direction, based on viewing the Web as a set of reference streams that are transformed by Web components (clients, servers, and intermediaries). We propose a graph-based framework for describing this collection of streams and components. We identify three basic stream transformations that occur at nodes of the graph: aggregation, disaggregation and filtering, and we show how these transformations can be used to abstract the effects of different Web components on their associated reference streams. This view allows a structured approach to the analysis of why reference streams show given properties at different points in the Web.

Applying this approach to the study of locality requires good metrics for locality. These metrics must meet three criteria: 1) they must accurately capture temporal locality; 2) they must be independent of trace artifacts such as trace length; and 3) they must not involve manual procedures or model-based assumptions. We describe two metrics meeting these criteria that each capture a different kind of temporal locality in reference streams. The popularity component of temporal locality is captured by entropy, while the correlation component is captured by the interreference coefficient of variation. We argue that these metrics are more natural and more useful than previously proposed metrics for temporal locality.

We use this framework to analyze a diverse set of Web reference traces. We find that this framework can shed light on how and why locality properties vary across different locations in the Web topology. For example, we find that filtering and aggregation have opposing effects on the popularity component of temporal locality, which helps to explain why multilevel caching can be effective in the Web. Furthermore, we find that all transformations tend to diminish the correlation component of temporal locality, which has implications for the utility of different cache replacement policies at different points in the Web.
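As a rough sketch of the two metrics, assuming the common formulations (Shannon entropy of the empirical popularity distribution for the popularity component, and the coefficient of variation of gaps between successive references to the same object for the correlation component; the paper's exact definitions may differ):

```python
import numpy as np
from collections import Counter

def popularity_entropy(stream):
    # Popularity component: entropy of the empirical reference distribution.
    counts = np.array(list(Counter(stream).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def interreference_cv(stream):
    # Correlation component: coefficient of variation of inter-reference
    # gaps, pooled over objects. CV > 1 suggests bursty re-referencing.
    last, gaps = {}, []
    for t, obj in enumerate(stream):
        if obj in last:
            gaps.append(t - last[obj])
        last[obj] = t
    gaps = np.array(gaps, dtype=float)
    return gaps.std() / gaps.mean() if gaps.size else float("nan")

stream = ["a", "b", "a", "a", "c", "b", "a", "c", "a", "b"]
print(popularity_entropy(stream), interreference_cv(stream))
```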

Relevance:

100.00%

Publisher:

Abstract:

Attributing a dollar value to a keyword is an essential part of running any profitable search engine advertising campaign. When an advertiser has complete control over the interaction with and monetization of each user arriving on a given keyword, the value of that term can be accurately tracked. However, in many instances, the advertiser may monetize arrivals indirectly through one or more third parties. In such cases, it is typical for the third party to provide only coarse-grained reporting: rather than report each monetization event, users are aggregated into larger channels and the third party reports aggregate information such as total daily revenue for each channel. Examples of third parties that use channels include Amazon and Google AdSense. In such scenarios, the number of channels is generally much smaller than the number of keywords whose value per click (VPC) we wish to learn. However, the advertiser has flexibility as to how to assign keywords to channels over time. We introduce the channelization problem: how do we adaptively assign keywords to channels over the course of multiple days to quickly obtain accurate VPC estimates of all keywords? We relate this problem to classical results in weighing design, devise new adaptive algorithms for this problem, and quantify the performance of these algorithms experimentally. Our results demonstrate that adaptive weighing designs that exploit statistics of term frequency, variability in VPCs across keywords, and flexible channel assignments over time provide the best estimators of keyword VPCs.
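The estimation step can be viewed as a linear system: each (day, channel) report contributes one equation in which the aggregate revenue equals the click-weighted sum of the VPCs of the keywords assigned to that channel, which is what ties the problem to weighing designs. A minimal non-adaptive sketch under that model, with a hypothetical rotating assignment standing in for the paper's adaptive designs:

```python
import numpy as np

rng = np.random.default_rng(0)
n_keywords, n_channels, n_days = 6, 2, 9
true_vpc = rng.uniform(0.1, 2.0, n_keywords)         # unknowns to recover
clicks = rng.poisson(50, size=(n_days, n_keywords))  # observed per-keyword clicks

rows, revenues = [], []
for d in range(n_days):
    assign = (np.arange(n_keywords) + d) % n_channels  # keyword -> channel, day d
    for c in range(n_channels):
        row = np.where(assign == c, clicks[d], 0.0)    # clicks feeding channel c
        rows.append(row)
        revenues.append(row @ true_vpc)                # aggregate daily report

# One linear equation per (day, channel): revenue = clicks . vpc.
est_vpc, *_ = np.linalg.lstsq(np.array(rows), np.array(revenues), rcond=None)
print("max abs estimation error:", np.abs(est_vpc - true_vpc).max())
```

Because the click coefficients vary from day to day, the system becomes identifiable even with far fewer channels than keywords; the point of the paper's adaptive designs is to choose the assignments so that this estimation converges quickly and accurately.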

Relevance:

100.00%

Publisher:

Abstract:

Natural and human-made disasters cause on average 120,000 deaths and over US$140 billion in damage to property and infrastructure every year, with national, regional and international actors consistently responding to the humanitarian imperative to alleviate suffering wherever it may be found. Despite various attempts to codify international disaster laws since the 1920s, a right to humanitarian assistance remains contested, reflecting concerns regarding the relative importance of state sovereignty vis-à-vis individual rights under international law. However, the evolving acquis humanitaire of binding and non-binding normative standards for responses to humanitarian crises highlights the increasing focus on rights and responsibilities applicable in disasters; although the International Law Commission has also noted the difficulty of identifying lex lata and lex ferenda regarding the protection of persons in the event of disasters due to the “amorphous state of the law relating to international disaster response.” Therefore, using the conceptual framework of transnational legal process, this thesis analyses the evolving normative frameworks and standards for rights-holders and duty-bearers in disasters. Determining the process whereby rights are created and evolve, and their potential internalisation into domestic law and policy, provides a powerful analytical framework for examining the progress and challenges of developing accountable responses to major disasters.

Relevance:

100.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store only 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner.

Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified, one that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?

The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions.

To demonstrate these concepts, a complex real-world example involving the near-real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, it is complex and no comprehensive solution exists; secondly, it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
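The workflow-and-provenance idea can be sketched as steps that emit both a derived result and an auditable record. This is a hypothetical scheme for illustration (names like run_step are invented), not the dissertation's platform:

```python
import hashlib, json, time

def run_step(name, fn, inputs, provenance):
    # Run one workflow step and append a provenance record: the step name,
    # a hash of its inputs, its output and a timestamp, so an independent
    # third party can later verify how a conclusion was derived.
    result = fn(inputs)
    provenance.append({
        "step": name,
        "input_sha256": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
        "output": result,
        "timestamp": time.time(),
    })
    return result

provenance = []
raw = [3, 1, 4, 1, 5]
cleaned = run_step("drop-outliers", lambda xs: [x for x in xs if x < 5], raw, provenance)
mean = run_step("mean", lambda xs: sum(xs) / len(xs), cleaned, provenance)
print(mean, len(provenance))   # derived value plus an auditable 2-record trail
```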

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: The impact of aromatase inhibitors (AIs) on non-cancer-related outcomes, which are known to be affected by oestrogens, has become increasingly important in postmenopausal women with hormone-dependent breast cancer. So far, data related to the effect of AIs on the lipid profile in postmenopausal women are scarce. This study, as a companion substudy of an EORTC phase II trial (10951), evaluated the impact of exemestane, a steroidal aromatase inactivator, on the lipid profile of postmenopausal metastatic breast cancer (MBC) patients. PATIENTS AND METHODS: The EORTC trial 10951 randomised 122 postmenopausal breast cancer patients to exemestane (E) 25 mg (n = 62) or tamoxifen (T) 20 mg (n = 60) once daily as first-line treatment in the metastatic setting. Exemestane showed promising results in all the primary efficacy end points of the trial (response rate, clinical benefit rate and response duration), and it was well tolerated with a low incidence of serious toxicity. As a secondary end point of this phase II trial, serum triglycerides (TRG), high-density lipoprotein cholesterol (HDL), total cholesterol (TC), lipoprotein a (Lip a), and apolipoproteins (Apo) B and A1 were measured at baseline and while on therapy (at 8, 24 and 48 weeks) to assess the impact of exemestane and tamoxifen on serum lipid profiles. Of the 122 randomised patients, those who had a baseline and at least one other lipid assessment are included in the present analysis. Patients who received concomitant drugs that could affect the lipid profile are included only if these drugs were administered throughout the study treatment. Increases or decreases in lipid parameters within 20% of baseline were considered non-significant and thus unchanged. RESULTS: Seventy-two patients (36 in each arm) were included in the statistical analysis. The majority of patients had abnormal TC and normal TRG, HDL, Apo A1, Apo B and Lip a levels at baseline. Neither exemestane nor tamoxifen had adverse effects on TC, HDL, Apo A1, Apo B or Lip a levels at 8, 24 and 48 weeks of treatment. Exemestane and tamoxifen had opposite effects on TRG levels: exemestane lowered while tamoxifen increased TRG levels over time. There were too few patients with normal baseline TC and abnormal TRG, HDL, Apo A1, Apo B and Lip a levels to allow for assessment of E's impact on these subsets. The atherogenic risk determined by Apo A1:Apo B and TC:HDL ratios remained unchanged throughout the treatment period in both the E and T arms. CONCLUSIONS: Overall, exemestane has no detrimental effect on cholesterol levels or the atherogenic indices, which are well-known risk factors for coronary artery disease. In addition, it has a beneficial effect on TRG levels. These data, coupled with exemestane's excellent efficacy and tolerability, support further exploration of its potential in the metastatic, adjuvant and chemopreventive setting.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, the storage and use of residual newborn screening (NBS) samples has gained attention. To inform ongoing policy discussions, this article provides an update of previous work on new policies, educational materials, and parental options regarding the storage and use of residual NBS samples. A review of state NBS Web sites was conducted in January 2010 for information related to the storage and use of residual NBS samples. In addition, a review of current statutes and bills introduced between 2005 and 2009 regarding the storage and/or use of residual NBS samples was conducted. Fourteen states currently provide information about the storage and/or use of residual NBS samples. Nine states provide parents the option to request destruction of the residual NBS sample after the required storage period or the option to exclude the sample from research uses. In the coming years, it is anticipated that more states will consider policies to address parental concerns about the storage and use of residual NBS samples. Development of new policies regarding the storage and use of residual NBS samples will require careful consideration of the impact on NBS programs, parent and provider educational materials, and respect for parents, among other issues.

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Neurodegenerative diseases (NDD) are characterized by progressive decline and loss of function, requiring considerable third-party care. NDD carers report low quality of life and high caregiver burden. Despite this, little information is available about the unmet needs of NDD caregivers. METHODS: Data from a cross-sectional, whole-of-population study conducted in South Australia were analyzed to determine the profile and unmet care needs of people who identify as having provided care for a person who died an expected death from NDDs, including motor neurone disease and multiple sclerosis. Bivariate analyses using chi-square (χ2) tests were complemented with a regression analysis. RESULTS: Two hundred and thirty respondents had a person close to them die from an NDD in the 5 years before responding. NDD caregivers were more likely to have provided care for more than 2 years and were more able to move on after the death than caregivers of people with other disorders such as cancer. NDD caregivers accessed palliative care services at the same rate as other caregivers at the end of life; however, people with an NDD were almost twice as likely to die in the community (odds ratio [OR] 1.97; 95% confidence interval [CI] 1.30 to 3.01), controlling for relevant caregiver factors. NDD caregivers reported significantly more unmet needs in emotional, spiritual, and bereavement support. CONCLUSION: This study is a first step in better understanding, across the whole population, the consequences of an expected death from an NDD. Assessments need to occur while in the caregiving role and in the subsequent bereavement phase.

Relevance:

100.00%

Publisher:

Abstract:

The Veterans Health Administration (VHA) in the Department of Veterans Affairs (VA) has emerged as a national and international leader in the delivery and research of telehealth-based treatment. Several unique characteristics of care in VA settings intersect to create an ideal environment for telehealth modalities and research. However, the value of telehealth experience and initiatives in VA settings is limited if telehealth strategies cannot be widely exported to other public or private systems. Whereas a hierarchical organization such as the VA can innovate and fund change relatively quickly based on provider and patient preferences and a growing knowledge base, other health provider organizations and third-party payers will likely require replicable scientific findings over time before incremental investments are made to create infrastructure, reform regulatory barriers, and amend laws to accommodate expansion of telehealth modalities. Accordingly, large-scale, scientifically rigorous telehealth research in VHA settings is essential not only to investigate the efficacy of existing and future telehealth practices in VHA, but also to hasten the development of telehealth infrastructure in private and other public health settings. We propose an expanded partnership between the VA, NIH, and other funding agencies to investigate creative and pragmatic uses of telehealth technology. To this end, we identify six specific areas of research we believe to be particularly relevant to the efficient development of telehealth modalities in civilian and military contexts outside the VHA.

Relevance:

100.00%

Publisher:

Abstract:

RATIONALE: Asthma is prospectively associated with age-related chronic diseases and mortality, suggesting the hypothesis that asthma may relate to a general, multisystem phenotype of accelerated aging. OBJECTIVES: To test whether chronic asthma is associated with a proposed biomarker of accelerated aging, leukocyte telomere length. METHODS: Asthma was ascertained prospectively in the Dunedin Multidisciplinary Health and Development Study cohort (n = 1,037) at nine in-person assessments spanning ages 9-38 years. Leukocyte telomere length was measured at ages 26 and 38 years. Asthma was classified as life-course-persistent, childhood-onset not meeting criteria for persistence, and adolescent/adult-onset. We tested associations between asthma and leukocyte telomere length using regression models. We tested for confounding of asthma-leukocyte telomere length associations using covariate adjustment. We tested serum C-reactive protein and white blood cell counts as potential mediators of asthma-leukocyte telomere length associations. MEASUREMENTS AND MAIN RESULTS: Study members with life-course-persistent asthma had shorter leukocyte telomere length as compared with sex- and age-matched peers with no reported asthma. In contrast, leukocyte telomere length in study members with childhood-onset and adolescent/adult-onset asthma was not different from leukocyte telomere length in peers with no reported asthma. Adjustment for life histories of obesity and smoking did not change results. Study members with life-course-persistent asthma had elevated blood eosinophil counts. Blood eosinophil count mediated 29% of the life-course-persistent asthma-leukocyte telomere length association. CONCLUSIONS: Life-course-persistent asthma is related to a proposed biomarker of accelerated aging, possibly via systemic eosinophilic inflammation. Life histories of asthma can inform studies of aging.

Relevance:

100.00%

Publisher:

Abstract:

© 2015 IEEE. We consider the problem of verification of software implementations of linear time-invariant controllers. Commonly, different implementations use different representations of the controller's state, for example due to optimizations in a third-party code generator. To accommodate this variation, we exploit the input-output specification captured by the controller's transfer function and show how to automatically verify the correctness of C-code controller implementations using a Frama-C/Why3/Z3 toolchain. Scalability of the approach is evaluated using randomly generated controller specifications of realistic size.
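The underlying equivalence notion, that two state representations are interchangeable exactly when they realise the same transfer function, can be illustrated numerically by comparing Markov parameters (D, CB, CAB, CA^2 B, ...). The paper itself verifies C implementations against the transfer function with Frama-C/Why3/Z3; the sketch below is only a numeric analogue of that check:

```python
import numpy as np

def markov_parameters(A, B, C, D, n):
    # First n Markov parameters D, CB, CAB, CA^2 B, ...; two realizations
    # have the same transfer function iff these sequences agree.
    params, Ak = [D], np.eye(A.shape[0])
    for _ in range(n - 1):
        params.append(C @ Ak @ B)
        Ak = Ak @ A
    return params

# A toy controller and a similarity-transformed variant: a different state
# representation (as a code generator might produce) with identical I/O behaviour.
A = np.array([[0.5, 0.1], [0.0, 0.3]]); B = np.array([[1.0], [0.5]])
C = np.array([[1.0, 2.0]]);             D = np.array([[0.0]])
T = np.array([[2.0, 1.0], [0.0, 1.0]])
Ti = np.linalg.inv(T)
A2, B2, C2 = Ti @ A @ T, Ti @ B, C @ T

same = all(np.allclose(p, q) for p, q in
           zip(markov_parameters(A, B, C, D, 6),
               markov_parameters(A2, B2, C2, D, 6)))
print("input-output equivalent:", same)   # True
```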

Relevance:

100.00%

Publisher:

Abstract:

During the 19th century, Frédéric Chopin (1810-1849), Franz Liszt (1811-1886), and Johannes Brahms (1833-1897) were among the most recognized composers of character pieces, and their compositions are considered a significant milestone in piano literature.

Chopin did not give descriptive titles to his character pieces; he grouped them into genres such as Mazurkas and Polonaises. His Mazurkas and Polonaises draw on Polish dance music and are inspired by the Polish national idiom.

Liszt was influenced in many ways by Chopin, and adopted Chopin's lyricism, melodic style, and tempo rubato. However, Liszt frequently drew on non-musical subjects (e.g., art, literature) for inspiration. "Harmonies poétiques et religieuses" and "Années de pèlerinage" are especially representative character pieces in which poetic and pictorial imagination are reflected.

Brahms was a conservative traditionalist, synthesizing Romantic expression and Classical tradition remarkably well. Like Chopin, Brahms avoided programmatic titles for his works; the titles of his short character pieces are often taken from traditional lyrical or dramatic genres such as ballade, rhapsody and scherzo. Because of his conservatism, Brahms was considered the main rival of Liszt in the Romantic period. Brahms's character pieces in his third period (e.g., the Scherzo Op. 4, the Ballades Op. 10, and the Rhapsodies Op. 79) are concise and focused. The form of Brahms's character pieces is mostly simple ternary (ABA), and his style is introspective and lyrical.

Through this recording project, I was able to gain a better understanding of the styles of Chopin, Liszt and Brahms through their character pieces. This recording dissertation consists of two CDs recorded in the Dekelboum Concert Hall at the University of Maryland, College Park. The recordings are documented on compact discs housed within the University of Maryland Library System.