932 results for Insurance companies, American
Abstract:
This paper presents an easy-to-use methodology and system for insurance companies aimed at managing the traffic accident reporting process. The main objective is to facilitate and accelerate the process of creating and finalizing the necessary accident reports in cases that do not involve fatalities. The various entities that participate in the process, from the moment an accident occurs until the related final actions are completed, are included. Nowadays, this market is limited to the consulting platforms offered by the insurance companies. Copyright 2014 ACM.
Abstract:
What are the ethical and political implications when the very foundations of life — things of awe and spiritual significance — are translated into products accessible to few people? This book critically analyses this historic recontextualisation. Through mediation — when meaning moves ‘from one text to another, from one discourse to another’ — biotechnology is transformed into analysable data and into public discourses. This unique book links biotechnology with media and citizenship. Like any other commodity, biological products have been commodified. Because enormous speculative investment rests on this, risk will be understated and benefit overstated, and benefits will be unfairly distributed. Already, the bioprospecting of Southern megadiverse nations, legally sanctioned by U.S. property rights conventions, has led to wealth and health benefits in the North. Crucial to this development are biotechnological discourses that shift meanings from a “language of life” into technocratic discourses, infused with neo-liberal economic assumptions that promise progress and benefits for all. Equally crucial is the mass media’s representation of biotechnology for an audience with poor scientific literacy. Yet even apparently benign biotechnology spawned by the Human Genome Project, such as prenatal screening, has eugenic possibilities, and genetic codes for illness are eagerly sought by insurance companies seeking to exclude certain people. These issues raise important questions about a citizenship founded on moral responsibility for the wellbeing of society now and into the future. After all, biotechnology is very much concerned with the essence of life itself. This book provides a space for alternative and dissident voices beyond the hype that surrounds biotechnology.
Abstract:
In the medical and healthcare arena, patients' data is not just their own personal history but also a valuable large dataset for finding solutions to diseases. While electronic medical records are becoming popular and are used in healthcare workplaces such as hospitals, as well as by insurance companies and by major stakeholders such as physicians and their patients, access to such information should be handled in a way that preserves privacy and security. Finding the best way to keep the data secure has therefore become an important issue in database security. Sensitive medical data should be encrypted in databases. There are many encryption/decryption techniques and algorithms for preserving privacy and security, and their performance is an important factor when medical data is managed in databases. Another important factor is that stakeholders should choose more cost-effective ways to reduce the total cost of ownership. As an alternative, DAS (Data as Service) is a popular outsourcing model that satisfies this cost-effectiveness, but it requires that the encryption/decryption modules be handled by trustworthy stakeholders. This research project focuses on query response times in a DAS model (AES-DAS) and compares the outsourcing model with an in-house model that uses Microsoft's built-in encryption scheme in SQL Server. The project includes building a prototype of medical database schemas. Two types of simulation were carried out. The first stage used six databases to measure the performance of plain text, Microsoft built-in encryption, and AES-DAS. In particular, AES-DAS incorporates symmetric-key encryption with AES (Advanced Encryption Standard) and a bucket-indexing processor using a Bloom filter. The results are categorised by character type, numeric type, range queries, range queries using the bucket index, and aggregate queries. The second stage is a scalability test from 5K to 2560K records. The main result of these simulations is that, as an outsourcing model, AES-DAS with the bucket index is around 3.32 times faster than plain AES-DAS with 70 partitions and 10K-record databases. Retrieving numeric data takes less time than character data in AES-DAS. The aggregate query response time in AES-DAS is not as consistent as that of the Microsoft built-in encryption scheme. The scalability test shows that once the DBMS reaches a certain threshold, query response time degrades rapidly. However, further investigation is needed to produce other outcomes and to construct a secure EMR (Electronic Medical Record) more efficiently from these simulations.
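A minimal sketch of the bucket-index-with-Bloom-filter idea referred to above, with hypothetical parameters and names; the actual AES encryption of stored values is elided, and only the 70-partition figure is taken from the abstract. A server holding ciphertexts plus this index can prune candidate rows before the client decrypts.

```python
import hashlib

NUM_BUCKETS = 70          # the abstract reports results for 70 partitions
BLOOM_BITS = 1024
BLOOM_HASHES = 3

def bucket_id(value: int, lo: int = 0, hi: int = 10_000) -> int:
    """Map a plaintext value to a coarse bucket before it is encrypted."""
    width = (hi - lo) / NUM_BUCKETS
    return min(int((value - lo) / width), NUM_BUCKETS - 1)

def bloom_positions(key: str) -> list[int]:
    """Derive BLOOM_HASHES bit positions from a single SHA-256 digest."""
    digest = hashlib.sha256(key.encode()).digest()
    return [int.from_bytes(digest[4 * i:4 * i + 4], "big") % BLOOM_BITS
            for i in range(BLOOM_HASHES)]

class BucketBloomIndex:
    """Server-side index: one Bloom filter of inserted values per bucket."""

    def __init__(self) -> None:
        self.filters = [[0] * BLOOM_BITS for _ in range(NUM_BUCKETS)]

    def insert(self, value: int) -> int:
        """Register a value; the ciphertext would be stored under the returned bucket id."""
        b = bucket_id(value)
        for p in bloom_positions(str(value)):
            self.filters[b][p] = 1
        return b

    def may_contain(self, value: int) -> bool:
        """Point-query pruning: False means no stored row can match this value."""
        b = bucket_id(value)
        return all(self.filters[b][p] for p in bloom_positions(str(value)))

    def candidate_buckets(self, lo: int, hi: int) -> range:
        """Buckets that may hold rows matching the range query [lo, hi]."""
        return range(bucket_id(lo), bucket_id(hi) + 1)

index = BucketBloomIndex()
for v in (120, 4_500, 9_800):
    index.insert(v)

print(index.may_contain(4_500), index.may_contain(4_501))   # True, (almost surely) False
print(list(index.candidate_buckets(4_000, 5_000)))          # buckets to fetch and decrypt
```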
Abstract:
Purpose – The purpose of this paper is to look at auditor obligations to their clients and potentially to third parties such as investors, with a focus on the quality of financial disclosure in an evolving legal framework. Design/methodology/approach – The article outlines and compares established and emerging trends relative to information disclosure and contractual performance in parallel contexts where information asymmetry exists. In particular, this article considers the disclosure regime that has evolved in the insurance industry to address the substantial imbalance in the level of knowledge possessed by the insured in comparison to the prospective insurer. Abductive reasoning is used to identify causal constructs that explain the data pattern from which the theorised potential for judicial revision of the interpretation of “true and fair” in line with “good faith” in legal regulation is derived. Findings – The authors conclude that there is little doubt that a duty of good faith in relation to auditor-company contractual dealings and potentially a broader good faith duty to third parties such as investors in companies may be on the horizon. Originality/value – In the context of stated objectives by organisations such as the International Federation of Accountants to reconcile ethical and technical skills in the wake of the global financial crisis, there is an increased need to rebuild public and investor confidence in the underpinning integrity of financial reporting. This paper offers a perspective on one way to achieve this by recognising the similarities in the information asymmetry relationships in the insurance industry and how the notion of “good faith” in that relationship could be useful in the audit situation.
Abstract:
Nowadays, process management systems (PMSs) are widely used in many business scenarios, e.g. by government agencies, by insurance companies, and by banks. Despite this widespread usage, the typical application of such systems is predominantly in the context of static scenarios, instead of pervasive and highly dynamic scenarios. Nevertheless, pervasive and highly dynamic scenarios could also benefit from the use of PMSs.
Abstract:
In market economies the built environment is largely the product of private-sector property development. Property development is a high-risk entrepreneurial activity: it executes expensive projects with long gestation periods in an uncertain environment and into an uncertain future. Risk lies at the core of development: the developer manages the multiple risks of development, and it is the capital injection and financing that are placed at risk. From the developer's perspective the search for development capital is a quest to access more finance, over a longer term, with fewer conditions and at lower rates. From the supply angle, capital from various sources - banks, insurance companies, superannuation funds, accumulated firm profits, retail investors and private equity - is always seeking above-market returns for limited risk. Property development presents one potentially lucrative, but risky, investment opportunity. Competition for returns on capital produces a continual evolution of methods for funding property developments, and thus the relationship between capital and development, and the outcomes for the built environment, are in restless, continual evolution. Little is documented about the ways development is financed in Australia, and even less about the consequences for cities. Using publicly available data sources and examples of different development financing from Australian practice, this paper argues that different methods of financing development have different outcomes and consequences for the built environment. The paper also presents an agenda for further research into these themes.
Abstract:
Today’s information systems log vast amounts of data. These collections of data (implicitly) describe events (e.g. placing an order or taking a blood test) and, hence, provide information on the actual execution of business processes. The analysis of such data provides an excellent starting point for business process improvement. This is the realm of process mining, an area which has provided a repertoire of many analysis techniques. Despite the impressive capabilities of existing process mining algorithms, dealing with the abundance of data recorded by contemporary systems and devices remains a challenge. Of particular importance is the capability to guide the meaningful interpretation of “oceans of data” by process analysts. To this end, insights from the field of visual analytics can be leveraged. This article proposes an approach where process states are reconstructed from event logs and visualised in succession, leading to an animated history of a process. This approach is customisable in how a process state, partially defined through a collection of activity instances, is visualised: one can select a map and specify a projection of events on this map based on the properties of the events. This paper describes a comprehensive implementation of the proposal. It was realised using the open-source process mining framework ProM. Moreover, this paper also reports on an evaluation of the approach conducted with Suncorp, one of Australia’s largest insurance companies.
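The implementation described above is a ProM plug-in; as a rough analogue only, the following Python sketch (with hypothetical event fields and map coordinates) shows the core idea of reconstructing successive process states from an event log and projecting the active activity instances onto a map to form the frames of an animated process history.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# each event: (case id, activity, lifecycle transition, timestamp, map coordinates)
events = [
    ("claim-1", "Lodge claim",  "start",    datetime(2012, 5, 1, 9, 0),  (2, 5)),
    ("claim-1", "Lodge claim",  "complete", datetime(2012, 5, 1, 9, 30), (2, 5)),
    ("claim-1", "Assess claim", "start",    datetime(2012, 5, 1, 10, 0), (6, 3)),
    ("claim-2", "Lodge claim",  "start",    datetime(2012, 5, 1, 9, 45), (2, 5)),
]

def state_at(t):
    """Activity instances that have started but not yet completed at time t."""
    open_instances = {}
    for case, activity, lifecycle, ts, pos in sorted(events, key=lambda e: e[3]):
        if ts > t:
            break
        key = (case, activity)
        if lifecycle == "start":
            open_instances[key] = pos
        elif lifecycle == "complete":
            open_instances.pop(key, None)
    return open_instances

# frames of the animated history: one projected state every 15 minutes
t = datetime(2012, 5, 1, 9, 0)
end = datetime(2012, 5, 1, 10, 30)
while t <= end:
    frame = defaultdict(int)
    for _, pos in state_at(t).items():
        frame[pos] += 1            # how many open instances sit at each map position
    print(t.time(), dict(frame))
    t += timedelta(minutes=15)
```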
Abstract:
This project investigated which aspects of being flooded most affected mental health outcomes. It found that stress in the aftermath of the flood, during the clean-up and rebuilding phase, including stress due to difficulties with insurance companies, was a previously overlooked risk factor, and social support and sense of belonging were the strongest protective factors. Implications for community recovery following disasters include providing effective targeting of support services throughout the lengthy rebuilding phase; the need to co-ordinate tradespeople; and training for insurance company staff aimed at minimising the incidence of insurance company staff inadvertently adding to disaster victims' stress.
Abstract:
Transaction processing is a key constituent of the IT workload of commercial enterprises (e.g., banks, insurance companies). Even today, in many large enterprises, transaction processing is done by legacy "batch" applications, which run offline and process accumulated transactions. Developers acknowledge the presence of multiple loosely coupled pieces of functionality within individual applications. Identifying such pieces of functionality (which we call "services") is desirable for the maintenance and evolution of these legacy applications. This is a hard problem, which enterprises grapple with, and one without satisfactory automated solutions. In this paper, we propose a novel static-analysis-based solution to the problem of identifying services within transaction-processing programs. We provide a formal characterization of services in terms of control-flow and data-flow properties, which is well-suited to the idioms commonly exhibited by business applications. Our technique combines program slicing with the detection of conditional code regions to identify services in accordance with our characterization. A preliminary evaluation, based on a manual analysis of three real business programs, indicates that our approach can be effective in identifying useful services from batch applications.
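A toy illustration, not the paper's actual analysis, of the characterisation sketched above: batch programs typically dispatch on a record or transaction type, so grouping statements by their guarding condition and pulling in backward data dependences (a crude stand-in for program slicing over a proper dependence graph) yields candidate services.

```python
from collections import defaultdict

# toy "program": (statement id, guarding condition, variables defined, variables used)
statements = [
    (1, None,        {"rec"},    {"input"}),          # read next record
    (2, None,        {"type"},   {"rec"}),            # extract transaction type
    (3, "type == A", {"bal"},    {"rec", "acct"}),    # region A: update balance
    (4, "type == A", {"report"}, {"bal"}),            # region A: write report line
    (5, "type == B", {"policy"}, {"rec"}),            # region B: update policy
]

def backward_slice(target_ids):
    """Statements the targets transitively depend on via def-use chains."""
    result, work = set(), list(target_ids)
    while work:
        sid = work.pop()
        if sid in result:
            continue
        result.add(sid)
        _, _, _, uses = next(s for s in statements if s[0] == sid)
        for other_id, _, defs, _ in statements:
            if other_id not in result and defs & uses:
                work.append(other_id)
    return result

# conditional code regions: statements grouped by their guard
regions = defaultdict(list)
for sid, guard, _, _ in statements:
    if guard is not None:
        regions[guard].append(sid)

for guard, sids in regions.items():
    service = sorted(backward_slice(sids))
    print(f"candidate service under {guard!r}: statements {service}")
```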
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to an estimated 650,000 to 900,000 Americans infected today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and where treatment of HIV/AIDS is projected to be headed.
Chapter Two describes the datasets that were used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and includes data from 1984 until the present. The second database is the public dataset of the Multicenter AIDS Cohort Study (MACS), covering the period between 1984 and October 1992. Comparisons are made between the two datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival, and that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian; thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection there are high levels of immunosuppression.
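As a hedged illustration only, on synthetic right-skewed data rather than the study's clinic or MACS records, the following sketch shows the kind of distributional check the chapter describes for CD4 counts.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
cd4 = rng.lognormal(mean=6.0, sigma=0.5, size=500)   # skewed, CD4-like counts (cells/mm3)

stat, p = stats.normaltest(cd4)                      # D'Agostino-Pearson omnibus test
print(f"normality test p-value: {p:.3g}")            # small p => reject the Gaussian assumption
print(f"sample skewness: {stats.skew(cd4):.2f}")     # positive skew typical of count data
```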
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono- or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood, providing a faster, more direct way to test the efficacy of an antiretroviral regimen. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover, and that if viral levels could be kept low enough, the immune system could eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
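A small sketch, on synthetic data with hypothetical variable names, of the kind of survival analysis described in this chapter: time on long-term HAART until an AIDS-defining illness, with censoring for patients who did not reach the endpoint, using the Kaplan-Meier estimator from the lifelines library.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 213                                              # cohort size from the abstract
followup_months = rng.exponential(scale=36, size=n)  # observed time on HAART
had_event = rng.random(n) < 0.4                      # True = AIDS-defining illness observed

kmf = KaplanMeierFitter()
kmf.fit(durations=followup_months, event_observed=had_event, label="long-term HAART")

print(kmf.survival_function_.tail())                 # estimated probability of remaining event-free
print("median event-free time (months):", kmf.median_survival_time_)
```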
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high, and for the first time in the epidemic there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated: the direct lifetime cost of treating each HIV-infected patient with HAART is estimated at between $353,000 and $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved, however, is only $101,000, which is comparable with the incremental cost per year of life saved from coronary artery bypass surgery.
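For clarity, the incremental cost-effectiveness arithmetic referred to above works as in the following sketch; the inputs are placeholders, not the study's actual figures.

```python
def cost_per_life_year_gained(cost_with_haart: float, cost_without_haart: float,
                              life_years_with_haart: float, life_years_without_haart: float) -> float:
    """Incremental cost divided by incremental years of life (the ICER)."""
    extra_cost = cost_with_haart - cost_without_haart
    extra_years = life_years_with_haart - life_years_without_haart
    return extra_cost / extra_years

# purely illustrative inputs, chosen only to show the shape of the calculation
print(f"${cost_per_life_year_gained(500_000, 200_000, 10, 7):,.0f} per life-year gained")
```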
Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
Insurance activity involves the transfer of risk from the insured to the insurer: the insurer undertakes to pay a benefit if the risk materialises. This inverts the production cycle, since the insurer sells coverage without knowing the exact timing and cost of that coverage. This peculiarity of insurance explains why an insurance company needs to be solvent at all times and in the face of any contingency. For this reason, the solvency of insurance companies has been addressed in the successive regulations governing insurance activity and has been given ever greater importance. The legislation currently in force on insurer solvency is the European directive Solvency I, which establishes two concepts to guarantee solvency: technical provisions and the solvency margin. Technical provisions are calculated to guarantee the static solvency of the company, that is, the solvency that covers, at a given point in time, the commitments the entity has assumed. The solvency margin is intended to cover dynamic solvency, which refers to future events that may affect the insurer's capacity. However, as part of a trend towards global risk management, in which the banking sector had already moved ahead of the insurance sector with the Basel II rules, a European project to reform Solvency I was launched, and in November 2009 Directive 2009/138/EC of the European Parliament and of the Council, on life insurance and on the taking-up and pursuit of the business of insurance and reinsurance, better known as Solvency II, was adopted. This directive entails a profound change in the current solvency rules for insurance companies, with the objective of establishing a common European regulatory framework that is better adapted to the risk profile of each insurer. The new directive defines two distinct levels of capital: the SCR (Solvency Capital Requirement) and the MCR (Minimum Capital Requirement). For the calculation of the SCR, insurers are free to choose between two models: a standard model proposed by the European Insurance and Occupational Pensions Authority (EIOPA), which allows a simple calculation, and an internal model developed by the entity itself, which must be approved by the competent authorities. A mixed model combining the standard and internal approaches may also be used. A series of quantitative impact studies (QIS) has been carried out to develop the standard model; the latest, QIS5, sets out the calculation of the SCR most precisely, defining shocks to be applied to the entity's balance sheet in order to stress it, with the SCR constructed from the results obtained. The objective of this work is to summarise the QIS5 technical specifications for life insurance and to carry out a practical application for a plain endowment (seguro mixto puro) life insurance product. In the practical application, the cash flows associated with this product are determined in order to calculate its best estimate; the SCR is then determined by applying the shocks for mortality, lapse and expense risks; finally, the risk margin associated with the SCR is calculated.
This final degree project (TFG) ends with conclusions, the bibliography used, and an annex containing the tables employed.
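A deliberately simplified numerical sketch of the calculation chain outlined above for a single endowment-type policy: a best estimate as discounted expected cash flows, shocked recalculations for mortality, lapse and expense risk, aggregation of the resulting capital charges with a correlation matrix, and a cost-of-capital risk margin. All figures, shock sizes and correlations below are placeholder assumptions in the spirit of QIS5, not the specification's values.

```python
import numpy as np

years = np.arange(1, 11)
discount = 1.03 ** -years.astype(float)            # assumed flat risk-free curve

def best_estimate(qx=0.010, lapse=0.05, expense=30.0):
    """Discounted expected outgo minus premiums for a toy 10-year endowment."""
    in_force_start = np.concatenate(([1.0], np.cumprod(
        np.full(len(years) - 1, (1 - qx) * (1 - lapse)))))
    in_force_end = in_force_start * (1 - qx) * (1 - lapse)
    death_benefit = 10_000 * qx * in_force_start
    maturity = np.zeros(len(years))
    maturity[-1] = 10_000 * in_force_end[-1]
    expenses = expense * in_force_start
    premiums = 1_100 * in_force_start
    return float(np.sum((death_benefit + maturity + expenses - premiums) * discount))

be = best_estimate()

# capital charge per sub-risk = increase in the best estimate under the shock
scr_mortality = best_estimate(qx=0.010 * 1.15) - be       # mortality rates up 15%
scr_lapse     = best_estimate(lapse=0.05 * 1.50) - be     # lapse rates up 50%
scr_expense   = best_estimate(expense=30.0 * 1.10) - be   # expenses up 10%

charges = np.array([max(scr_mortality, 0.0), max(scr_lapse, 0.0), max(scr_expense, 0.0)])
corr = np.array([[1.00, 0.00, 0.25],                      # assumed correlation matrix
                 [0.00, 1.00, 0.50],
                 [0.25, 0.50, 1.00]])
scr_life = float(np.sqrt(charges @ corr @ charges))

risk_margin = 0.06 * scr_life * float(np.sum(discount))   # 6% cost-of-capital proxy
print(f"best estimate: {be:,.0f}  SCR_life: {scr_life:,.0f}  risk margin: {risk_margin:,.0f}")
```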
Abstract:
The aim of this paper is to determine to what extent globalization pressures are changing interlocking directorate networks modeled on continental capitalism into Anglo-Saxon models. For this purpose we analyse the Spanish network of interlocks, comparing the present structure (2012) with those of 1993 and 2006. We show how, although the Spanish corporate structure continues to display characteristics of the continental economies, some major banks are significantly reducing their industrial activity. Nevertheless, financial organizations continue to maintain a close relationship with sectors such as construction and services. The analysis of the network of directorates shows a retreat of industrial banking activity in Spain. Two large Spanish financial institutions, BSCH and La Caixa, still undertook industrial banking activities in 2006, but this activity is significantly reduced in 2012. According to theories on the role of interlocking directorates, companies in these sectors secure their access to bank credit by incorporating advisors from financial organizations onto their boards of directors. We cannot conclude that the structure of the Spanish corporate network has become a new case of Anglo-Saxon structure, but we find indications that it is becoming less hierarchical, as banks seem slowly to abandon central positions. This is especially salient if we compare the networks of 2006 and 2012, which show a continuing decrease in the role of banks and insurance companies in the network.
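A hedged sketch, on toy data with invented firm names, of the kind of interlocking-directorate analysis described above: build a bipartite director-company graph, project it onto companies, and rank firms by centrality in the resulting interlock network, here using the networkx library.

```python
import networkx as nx
from networkx.algorithms import bipartite

# (director, company) board memberships for one observation year -- toy data
memberships = [
    ("director_1", "BancoA"), ("director_1", "ConstructoraB"),
    ("director_2", "BancoA"), ("director_2", "ServiciosC"),
    ("director_3", "AseguradoraD"), ("director_3", "ConstructoraB"),
]

directors = {d for d, _ in memberships}
companies = {c for _, c in memberships}

B = nx.Graph()
B.add_nodes_from(directors, bipartite=0)
B.add_nodes_from(companies, bipartite=1)
B.add_edges_from(memberships)

# company-company interlock network: an edge means at least one shared director
interlocks = bipartite.weighted_projected_graph(B, companies)

# centrality of each firm; repeating this per year shows whether banks retreat from central positions
centrality = nx.degree_centrality(interlocks)
for firm, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{firm}: {score:.2f}")
```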
Abstract:
This dissertation analysed the Empresas Promotoras de Salud (EPS), the health insurers introduced into the Colombian health system by the health reform established by Law No. 100 of 1993, from the perspective of critical political economy and using the method of documentary analysis. Most of them are private, for-profit companies that quickly formed an oligopoly which reproduced problems of the Managed Care and Managed Competition models already known internationally. The dissertation analysed the relationships between the financialisation of the capitalist system and the structural adjustment process in Colombia, on the one hand, and the health reform and the financial dynamics of the EPS, on the other. It also analysed the introduction of mechanisms typical of financialisation into the financial management of the EPS, such as leverage, the expanded reproduction of capital through public debt, and investment in securitised assets. Given that the current health system is characterised by high levels of inequity and injustice, the consequences of its for-profit orientation, with their concrete expressions of suffering and death in the population, were cross-cutting concerns of this work. The results demonstrate the oligopolistic concentration of the private health insurance market, whose companies organised themselves as a cartel, hindering access to health services for their insured members, which contributed to the worsening of the population's health indicators. When social mobilisation forced greater control over the EPS, they began to leave the market by declaring bankruptcy or suddenly reporting negative financial results.
Abstract:
Genes, species and ecosystems are often considered to be assets. The need to ensure a sufficient diversity of this asset is increasingly recognised today. Asset managers in banks and insurance companies face a similar challenge: they are asked to manage the assets of their investors by constructing efficient portfolios. They deliberately make use of a phenomenon observed in the formation of portfolios: returns are additive, while risks diversify. This phenomenon and its implications are at the heart of portfolio theory. Portfolio theory, like few other economic theories, has dramatically transformed the practical work of banks and insurance companies. Before portfolio theory was developed about 50 years ago, asset managers were confronted with a situation similar to the one biodiversity research faces today: while the need for diversification was generally accepted, a concept that linked risk and return at the portfolio level and showed the value of diversification was missing. Portfolio theory has closed this gap. This article first explains the fundamentals of portfolio theory and transfers them to biodiversity. A large part of the article is then dedicated to some of the implications portfolio theory has for the valuation and management of biodiversity. The last section identifies three openings for further research.
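A small numerical sketch, with illustrative figures, of the phenomenon the article builds on: the expected portfolio return is a weighted sum of the asset returns, while portfolio risk shrinks as the correlation between assets falls.

```python
import numpy as np

mu = np.array([0.06, 0.06])        # expected returns of two assets
sigma = np.array([0.20, 0.20])     # their standard deviations
w = np.array([0.5, 0.5])           # equal-weight portfolio

for rho in (1.0, 0.3, 0.0):
    cov = np.array([[sigma[0] ** 2,               rho * sigma[0] * sigma[1]],
                    [rho * sigma[0] * sigma[1],   sigma[1] ** 2]])
    port_return = w @ mu                          # additive: stays at 6% regardless of rho
    port_risk = np.sqrt(w @ cov @ w)              # diversifies: falls as correlation drops
    print(f"rho={rho:.1f}: return={port_return:.2%}, risk={port_risk:.2%}")
```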