994 results for "no costs ordered against liquidator"


Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to estimate the annual direct medical costs of hospitalizations due to osteoporotic fractures in Switzerland. Days of hospital stay in 1992 were quantified using the case records of the medical statistics department of VESKA (Vereinigung Schweizerischer Krankenhäuser, the Swiss Hospital Association), which covers 43% of all hospital beds in the country. The number and incidence of total hospitalizations due to fractures were calculated by extrapolating the 43% VESKA sample to 100%. To estimate the number and incidence of hospitalizations due to osteoporotic fractures, internationally accepted age-specific osteoporosis attribution rates were applied; according to these rates, the probability of a fracture being caused by osteoporosis increases with age. Mean length of stay for all fractures was calculated (total hospital days divided by number of cases). By multiplying these mean lengths of stay by the number of osteoporosis-related fracture cases, the number of bed-days due to osteoporotic fractures was obtained. To compare the direct medical costs of hospitalization due to osteoporosis with those due to other frequent diseases, days of hospital stay caused by chronic obstructive pulmonary disease (COPD), stroke, acute myocardial infarction and breast cancer were estimated using the same methodology. A total of 63,170 hospitalizations due to fractures and other osteoporosis-related diagnoses was estimated (women: 33,596, men: 29,574), leading to overall annual incidence rates of hospitalizations for fractures of 950/100,000 women and 877/100,000 men. In women, 548,615 hospital days were found to be caused by osteoporosis, 353,654 days by COPD, 352,062 days by stroke, 200,669 days by breast carcinoma and 131,331 days by myocardial infarction. In men, COPD caused more hospitalization days (537,164) than myocardial infarction (196,793), stroke (180,524) or osteoporosis (152,857). Taking a mean price of 845 Swiss francs per hospital day in Switzerland, the annual costs of acute hospitalizations due to osteoporosis and its complications were approximately 600 million Swiss francs (women: 464, men: 130 million Swiss francs) in 1992. We conclude that there is enough economic evidence to justify wide-scale interventions against osteoporosis in Switzerland.
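As a quick check, the headline cost figures follow directly from the reported bed-days and the assumed daily price; a minimal sketch in Python (all inputs are the figures quoted above):

    # Reproduce the cost arithmetic from the figures quoted in the abstract.
    CHF_PER_DAY = 845                                    # mean price of one hospital day
    osteo_bed_days = {"women": 548_615, "men": 152_857}  # osteoporosis bed-days, 1992
    costs = {grp: days * CHF_PER_DAY for grp, days in osteo_bed_days.items()}
    print(costs["women"] / 1e6)       # ~463.6 -> reported as 464 million CHF
    print(costs["men"] / 1e6)         # ~129.2 -> reported as 130 million CHF
    print(sum(costs.values()) / 1e6)  # ~592.7 -> "approximately 600 million"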

Relevance:

30.00%

Publisher:

Abstract:

Internet service providers (ISPs) play a pivotal role in contemporary society because they provide access to the Internet. The primary task of ISPs – to blindly transfer information across the network – has recently come under pressure, as has their status as neutral third parties. Both the public and the private sector have started to require ISPs to interfere with the content placed and transferred on the Internet, as well as with access to it, for a variety of purposes, including the fight against cybercrime, digital piracy, child pornography, etc. This expanding list necessitates a critical assessment of the role of ISPs. This paper analyses the role of the access provider. Particular attention is paid to Dutch case law in which access providers were forced to block The Pirate Bay. After analysing the position of ISPs, we define principles that can guide ISPs' decisions on whether to take action after a request to block access, based on directness, effectiveness, costs, relevance and time.

Relevance:

30.00%

Publisher:

Abstract:

Temperature is a first-class design concern in modern integrated circuits. The important increase in power densities associated with recent technology evolutions has led to the appearance of thermal gradients and hot spots during run-time operation. Temperature negatively impacts several circuit parameters, such as gate delay, cooling budgets, reliability and power consumption. In order to fight these negative effects, dynamic thermal management (DTM) techniques adapt the behaviour of the chip based on the information of a monitoring system that provides run-time thermal information of the die surface. The field of on-chip temperature monitoring has drawn the attention of the scientific community in recent years and is the object of study of this thesis. This thesis approaches on-chip temperature monitoring from different perspectives and levels, providing solutions to some of the most important issues. The physical and circuit levels are covered with the design and characterization of two novel temperature sensors specially tailored for DTM purposes. The first sensor is based on a mechanism that obtains a pulse whose width depends on the variation of leakage currents with temperature. In a nutshell, a circuit node is charged and subsequently left floating so that it discharges through the subthreshold currents of a transistor; the time the node takes to discharge is the width of the pulse. Since the width of the pulse displays an exponential dependence on temperature, the conversion into a digital word is realized by means of a logarithmic counter that performs both the time-to-digital conversion and the linearization of the output. The structure resulting from this combination of elements is implemented in a 0.35 µm technology and is characterized by a very reduced area, 10,250 nm², and power consumption, 1.05-65.5 nW at 5 samples/s; these figures outperformed all previous works at the time of first publication and, at the time of publication of this thesis, they still outperform all previous implementations fabricated in the same technology node. Concerning accuracy, the sensor exhibits good linearity even without calibration: it displays a 3σ error of 1.97 °C, appropriate for DTM applications. As explained, the sensor is completely compatible with standard CMOS processes; this fact, along with its tiny area and power overhead, makes it specially suitable for integration in a DTM monitoring system with a collection of on-chip monitors distributed across the chip. 
The exacerbated process fluctuations carried along with recent technology nodes jeopardize the linearity characteristics of the first sensor. In order to overcome these problems, a new temperature-inferring technique is proposed. The new technique also relies on the thermal dependencies of the leakage currents that discharge a floating node, but now the result comes from the ratio of two different measurements, in one of which a characteristic of the discharging transistor (the gate voltage) is altered. This ratio proves to be very robust against process variations and displays a more than sufficient linearity on temperature: a 1.17 °C 3σ error considering process variations and performing two-point calibration. The implementation of the sensing part based on this new technique implies several design issues, such as the generation of a process-variation-independent voltage reference, that are analyzed in depth in the thesis. In order to perform the time-to-digital conversion, we employ the same digitization structure as in the first sensor. For the physical implementation of the digitization part, a completely new standard cell library targeting low area and power overhead is built from scratch. Putting all the pieces together, we achieve a complete sensor system characterized by an ultra-low energy per conversion of 48-640 pJ and a tiny area of 0.0016 mm²; this figure outperforms all previous works. To prove this statement, we perform a thorough comparison with over 40 sensor proposals from the scientific literature. 
Moving up to the system level, the third contribution is centered on the modeling of a monitoring system consisting of a set of thermal sensors distributed across the chip. All previous works from the literature target maximizing the accuracy of the system with the minimum number of monitors. In contrast, we introduce new quality metrics apart from just the number of sensors; we also consider the power consumption, the sampling frequency, the interconnection costs and the possibility of choosing among different types of monitors. The model is fed into a simulated annealing algorithm that receives the thermal information of a system, its physical properties, area, power and interconnection constraints, and a collection of monitor types; the algorithm yields the selected type of monitor, the number of monitors, their positions and the optimum sampling rate. To prove its validity, we test the algorithm on the Alpha 21364 processor under several constraint configurations. When compared to previous works in the literature, the model presented here is the most complete. 
Finally, the last contribution targets the networking level: given an allocated set of temperature monitors, we focus on solving the problem of connecting them in a way that is efficient in area and power. Our first proposal in this field is the introduction of a new interconnection hierarchy level, the threshing level, between the monitors and the traditional peripheral buses, which applies data selectivity to reduce the amount of information sent to the central controller. The idea behind this new level is that in this kind of network most data are useless, because from the controller's viewpoint only a small amount of data (normally the extreme values) is of interest. To cover the new interconnection level, we propose a single-wire monitoring network based on a time-domain signaling scheme that significantly reduces both the switching activity over the wire and the power consumption of the network. This scheme codes the information in the time domain and delivers the monitor readings to the controller as an already ordered list of values, from maximum to minimum. If this signaling scheme is applied to monitors that perform time-to-digital conversion (TDC), the digitization resources can be shared in both time and space, producing an important saving in area and power consumption. Finally, two prototypes of complete monitoring systems are presented that significantly outperform previous works in terms of area and, especially, power consumption.
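As an illustration of why the two-measurement ratio cancels process variation, consider a first-order subthreshold leakage model (our back-of-the-envelope sketch; the model constants n, C and ΔV are assumptions, not values from the thesis):

\[
I_{\mathrm{leak}}(V_G) \propto e^{(V_G - V_{th})/(n V_T)}, \qquad t(V_G) = \frac{C\,\Delta V}{I_{\mathrm{leak}}(V_G)}, \qquad V_T = \frac{kT}{q},
\]

\[
\frac{t(V_{G1})}{t(V_{G2})} = e^{(V_{G2} - V_{G1})/(n V_T)} \;\Longrightarrow\; T = \frac{q\,(V_{G2} - V_{G1})}{n\,k\,\ln\!\left(t_1/t_2\right)},
\]

so the process-dependent capacitance C, discharge swing ΔV and threshold voltage V_th cancel in the ratio, consistent with the robustness claimed above.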

Relevance:

30.00%

Publisher:

Abstract:

There is evidence that the climate is changing and that the change is now influenced and accelerated by the increase of CO2 in the atmosphere due to human combustion. Such "climate change" is on the policy agenda at the global level, with the aim of understanding and reducing its causes and mitigating its consequences. In most countries and international organisms (UN, e.g. Rio de Janeiro 1992; OECD; EC; etc.) the efforts and debates have been directed at knowing the possible causes, predicting the future evolution of some conditioning variables, and making studies to fight the effects or to delay their negative evolution. The Kyoto Protocol of 1997 set international efforts regarding CO2 emissions, but it was partial and not followed by, for example, the USA and China, and Durban 2011 made evident humanity's ineffectiveness on such real global challenges. Amid all this, the elaboration of a global model that can help to choose the best among the feasible alternatives, to elaborate strategies and to evaluate costs has not been undertaken, and the authors propose to enter that frame of study. As in all natural, technological and social changes, the best-prepared countries will bear the changes best and recover most rapidly. The alternative will not be the same in all geographic areas, but the model must help us make the appropriate decision. It is essential to know which areas are more sensitive to the negative effects of climate change, the parameters to take into account for its evaluation, and the comprehensive plans to deal with it. The objective of this paper is to elaborate a mathematical decision-support model that will allow alternatives for adaptation to climate change to be developed and evaluated for different communities in Europe and Latin America, mainly in areas especially vulnerable to climate change, considering all the intervening factors. The models will consider criteria of a physical type (meteorological, edaphic, water resources), of land use (agricultural, forest, mining, industrial, urban, tourist, livestock), economic (income, costs, benefits, infrastructures), social (population), political (implementation, legislation), educational (educational programmes, dissemination) and environmental, at the present moment and in the future. The intention is to obtain tools that help to reach a realistic position on these challenges, which are an important part of the problems facing humanity in the coming decades.

Relevance:

30.00%

Publisher:

Abstract:

Climate change is on the policy agenda at the global level, with the aim of understanding and reducing its causes and mitigating its consequences. In most countries and international organisms (UN, OECD, EC, etc.) the efforts and debates have been directed at knowing the possible causes, predicting the future evolution of some conditioning variables, and making studies to fight the effects or to delay their negative evolution. Nevertheless, the elaboration of a global model that can help to choose the best among the feasible alternatives, to elaborate strategies and to evaluate costs has not been undertaken. As in all natural, technological and social changes, the best-prepared countries will bear the changes best and recover most rapidly. The alternative will not be the same in all geographic areas, but the model should help us make the appropriate decision. It is essential to know which areas are more sensitive to the negative effects of climate change, the parameters to take into account for its evaluation, and the comprehensive plans to deal with it. The objective of this paper is to elaborate a mathematical decision-support model that will allow alternatives for adaptation to climate change to be developed and evaluated for different communities in Europe and Latin America, mainly in areas vulnerable to climate change, considering all the intervening factors. The models will take into consideration criteria of a physical type (meteorological, edaphic, water resources), of land use (agricultural, forest, mining, industrial, urban, tourist, livestock), economic (income, costs, benefits, infrastructures), social (population), political (implementation, legislation), educational (educational programmes, dissemination), sanitary and environmental, at the present moment and in the future.
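Neither this abstract nor the preceding one specifies the mathematical form of the model; as a purely illustrative sketch, the multi-criteria evaluation they describe could be prototyped as a weighted score over the criteria families listed above (the weights, scores, alternative names and linear aggregation are all our assumptions):

    # Minimal sketch of a multi-criteria decision-support evaluation.
    # Criteria families follow the abstract; weights and 0-1 scores are invented
    # placeholders, and a simple weighted sum stands in for the unspecified model.
    weights = {"physical": 0.25, "land_use": 0.20, "economic": 0.20,
               "social": 0.15, "political": 0.10, "educational": 0.10}

    def score(alternative: dict) -> float:
        """Aggregate 0-1 criterion scores for one adaptation alternative."""
        return sum(weights[c] * alternative[c] for c in weights)

    # Compare two hypothetical adaptation strategies for a vulnerable area.
    coastal_defence = {"physical": 0.8, "land_use": 0.5, "economic": 0.4,
                       "social": 0.6, "political": 0.7, "educational": 0.3}
    managed_retreat = {"physical": 0.6, "land_use": 0.7, "economic": 0.7,
                       "social": 0.4, "political": 0.4, "educational": 0.6}
    options = [("coastal_defence", coastal_defence), ("managed_retreat", managed_retreat)]
    best = max(options, key=lambda kv: score(kv[1]))
    print(best[0], round(score(best[1]), 3))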

Relevance:

30.00%

Publisher:

Abstract:

Within the field of the city as a place, this paper analyses the concepts of territorial planning and spatial planning. Flooding is one of the main risks associated with many urban settlements in Spain and, indeed, elsewhere. The location of cities has traditionally ignored this type of risk, as other locational criteria prevailed (communications, crop yields, etc.). Defence engineering has been the customary way to offset the risk but, nowadays, the opportunity costs of engineering works in urban areas have highlighted the interest of “soft measures” based on prevention. Early warning systems plus development planning controls rank among the most favoured ones. This paper reflects the results of a recent EU-financed research project on alternative measures geared to the enhancement of urban resilience against flooding. A city study in Spain is used as an example of those measures.

Relevance:

30.00%

Publisher:

Abstract:

The current chemotherapeutic treatment of alveolar echinococcosis (AE) in humans is based on albendazole and/or mebendazole. However, the costs of treatment, life-long consumption of drugs, parasitostatic rather than parasiticidal activity of chemotherapy, and high recurrence rates after treatment interruption warrant more efficient treatment options. Experimental treatment of mice infected with Echinococcus multilocularis metacestodes with fenbendazole revealed efficacy similar to that of albendazole. Inspection of parasite tissue from infected and benzimidazole-treated mice by transmission electron microscopy (TEM) demonstrated drug-induced alterations within the germinal layer of the parasites and, most notably, an almost complete absence of microtriches. On the other hand, upon in vitro exposure of metacestodes to benzimidazoles, no phosphoglucose isomerase activity could be detected in medium supernatants during treatment with any of these drugs, indicating that in vitro treatment did not severely affect the viability of the metacestode tissue. Corresponding TEM analysis also revealed a dramatic shortening/retraction of microtriches as a hallmark of benzimidazole action and, as a consequence, separation of the acellular laminated layer from the cellular germinal layer. Since TEM did not reveal any microtubule-based structures within Echinococcus microtriches, this effect cannot be explained by the previously described mechanism of action of benzimidazoles targeting β-tubulin; benzimidazoles must therefore interact with additional targets that have not yet been identified. In addition, these results indicate the potential usefulness of fenbendazole for the chemotherapy of AE.

Relevance:

30.00%

Publisher:

Abstract:

The primary objective is to investigate the main factors contributing to GMS expenditure on pharmaceutical prescribing and to project this expenditure to 2026. This study is located in the pharmacoeconomic cost-containment and projections literature. The thesis has five main aims: 1. To determine the main factors contributing to GMS expenditure on pharmaceutical prescribing. 2. To develop a model to project GMS prescribing expenditure in five-year intervals to 2026, using 2006 Central Statistics Office (CSO) Census data and 2007 Health Service Executive-Primary Care Reimbursement Service (HSE-PCRS) sample data. 3. To develop a model to project GMS prescribing expenditure in five-year intervals to 2026, using 2012 HSE-PCRS population data, incorporating cost-containment measures, and 2011 CSO Census data. 4. To investigate the impact of demographic factors and the pharmacology of drugs (Anatomical Therapeutic Chemical (ATC) classification) on GMS expenditure. 5. To explore the consequences of GMS policy changes on prescribing expenditure and behaviour between 2008 and 2014. The thesis is centred around three published articles and is located between the end of a booming Irish economy in 2007, a recession from 2008 to 2013, and the beginning of a recovery in 2014. The literature identified a number of factors influencing pharmaceutical expenditure, including population growth, population ageing, changes in drug utilisation and drug therapies, age, gender and location. The literature also identified the methods previously used in predictive modelling; consequently, a Monte Carlo Simulation (MCS) model was used to simulate projected expenditures to 2026, and the literature guided the use of Ordinary Least Squares (OLS) regression in determining the demographic and pharmacological factors influencing prescribing expenditure. The study commences against a backdrop of growing GMS prescribing costs, which rose from €250 million in 1998 to over €1 billion by 2007. Using sample 2007 HSE-PCRS prescribing data (n = 192,000) and CSO population data from 2008, Conway et al. (2014) estimated that GMS prescribing expenditure could rise to €2 billion by 2026. The cogency of these findings was affected by the global economic crisis of 2008, which resulted in a sharp contraction of the Irish economy and mounting fiscal deficits, leading to Ireland's entry into a bailout programme. The sustainability of funding community drug schemes, such as the GMS, came under the spotlight of the EU, IMF and ECB (the Troika), who set stringent targets for reducing drug costs as conditions of the bailout programme. Cost-containment measures included: the introduction of income eligibility limits for GP visit cards and medical cards for those aged 70 and over, the introduction of co-payments for prescription items, and reductions in wholesale mark-up and pharmacy dispensing fees. Projections for GMS expenditure were re-evaluated using 2012 HSE-PCRS prescribing population data and CSO population data based on Census 2011. Taking into account both the cost-containment measures and the revised population predictions, GMS expenditure is estimated to increase by 64%, from €1.1 billion in 2016 to €1.8 billion by 2026 (Conway Lenihan and Woods, 2015). In the final paper, a cross-sectional study was carried out on the HSE-PCRS population prescribing database (n = 1.63 million claimants) to investigate the impact of demographic factors, and the pharmacology of the drugs, on GMS prescribing expenditure. 
Those aged over 75 (β = 1.195) and cardiovascular prescribing (β = 1.193) were the greatest contributors to annual GMS prescribing costs. Respiratory drugs (montelukast) recorded the highest proportion and expenditure for GMS claimants under the age of 15. Drugs prescribed for the nervous system (escitalopram, olanzapine and pregabalin) were highest for those between 16 and 64 years, while cardiovascular drugs (statins) were highest for those aged over 65. Female claimants are more expensive than males and are prescribed more items across the four ATC groups, except among children under 11 (Conway Lenihan et al., 2016). This research indicates that growth in the proportion of elderly claimants and the associated levels of cardiovascular prescribing, particularly for statins, will present difficulties for Ireland in terms of cost containment. Whilst policies aimed at cost containment (co-payment charges, generic substitution, reference pricing, adjustments to GMS eligibility) can be used to curtail expenditure, health promotion programmes and educational interventions should be given equal emphasis. Policies intended to affect physicians' prescribing behaviour, including guidelines, information (about prices and less expensive alternatives) and feedback, together with the use of budgetary restrictions, could also yield savings.
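The abstract names the projection method (Monte Carlo simulation) without detailing it; below is a minimal sketch of how such a projection can be set up in Python. The growth-rate distribution and all numeric inputs are illustrative assumptions (chosen so the median lands near the study's €1.8 billion figure), not parameters from the thesis:

    # Minimal sketch of a Monte Carlo projection of prescribing expenditure.
    import random

    def project(baseline: float, years: int, n_sims: int = 10_000) -> list[float]:
        """Simulate total expenditure after `years`, drawing one annual growth
        rate per year from an assumed normal distribution."""
        results = []
        for _ in range(n_sims):
            cost = baseline
            for _ in range(years):
                growth = random.gauss(0.05, 0.02)  # assumed: mean 5%, sd 2% per year
                cost *= 1 + growth
            results.append(cost)
        return results

    sims = sorted(project(baseline=1.1e9, years=10))  # €1.1bn in 2016, horizon 2026
    print(f"median: €{sims[len(sims) // 2] / 1e9:.2f}bn, "
          f"90% interval: €{sims[int(0.05 * len(sims))] / 1e9:.2f}-"
          f"€{sims[int(0.95 * len(sims))] / 1e9:.2f}bn")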

Relevance:

30.00%

Publisher:

Abstract:

Recent interest in replacing tipping with service charges or higher service-inclusive menu pricing prompted this review of empirical evidence on the advantages and disadvantages to restaurants of these different compensation systems. The evidence indicates that these different pricing systems affect the attraction and retention of service workers, the satisfaction of customers with service, the actual and perceived costs of eating out, and the costs of hiring employees and doing business. However, the author comes away from the data believing that the biggest reason for restaurateurs to replace tipping is that the practice takes revenue away from them in the form of lower prices and gives it to servers in the form of excessively high tip income. The biggest reason for restaurateurs to keep tipping is that it allows them to reduce menu prices, which increases demand. Thus, restaurateurs’ decisions to keep voluntary tipping or not should ultimately depend on the relative strengths of these benefits. The more that a restaurant’s servers are overpaid relative to the back of house and the wealthier and less price-sensitive a restaurant’s customers are, the more the owner of that restaurant should consider abandoning tipping. By this reasoning, many upscale, expensive restaurants (especially those in states with no or small tip credits) probably should replace tipping with one of its alternatives.

Relevance:

30.00%

Publisher:

Abstract:

Spam emails impose extremely heavy annual costs in terms of time, storage space and money on private users and companies. To fight the spam problem effectively, it is not enough to stop spam messages from being delivered to the user's inbox. It is necessary either to try to find and prosecute the spammers, who usually hide behind complex networks of infected devices, or to analyse spammer behaviour in order to find appropriate defence strategies. However, this task is difficult because of camouflage techniques, and it requires manual analysis of correlated spam messages to find the spammers. To facilitate such an analysis, which must be performed on large amounts of unclassified emails, we propose a categorical clustering methodology, named CCTree, that divides a large volume of spam into campaigns based on structural similarity. We show the effectiveness and efficiency of our proposed clustering algorithm through several experiments. Next, a self-learning approach is proposed to label spam campaigns according to the spammer's goal, e.g. phishing. The labelled spam campaigns are used to train a classifier, which can then be applied to classify new spam emails. In addition, the labelled campaigns, together with a set of four other ranking criteria, are ordered according to investigators' priorities. Finally, a semiring-based structure is proposed as an abstract representation of CCTree. The abstract scheme of CCTree, named CCTree term, is applied to formalize the parallelization of CCTree. Through a number of mathematical analyses and experimental results, we show the efficiency and effectiveness of the proposed framework.
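As a rough illustration of the idea of entropy-guided categorical clustering by structural similarity (our simplification for illustration only; the published CCTree algorithm's exact split and stopping rules may differ), each email is reduced to categorical structural features and recursively split on the most heterogeneous attribute:

    # Illustrative sketch of entropy-guided categorical clustering of spam.
    from collections import Counter
    from math import log2

    def entropy(values) -> float:
        counts = Counter(values)
        total = sum(counts.values())
        return -sum(c / total * log2(c / total) for c in counts.values())

    def split(records: list[dict], attrs: list[str], max_entropy: float = 0.5):
        """Recursively split on the highest-entropy attribute; leaves are campaigns."""
        if not attrs:
            return [records]
        scores = {a: entropy([r[a] for r in records]) for a in attrs}
        attr, h = max(scores.items(), key=lambda kv: kv[1])
        if h <= max_entropy:               # node homogeneous enough: one campaign
            return [records]
        groups: dict = {}
        for r in records:
            groups.setdefault(r[attr], []).append(r)
        rest = [a for a in attrs if a != attr]
        return [leaf for g in groups.values() for leaf in split(g, rest, max_entropy)]

    emails = [{"lang": "en", "links": "many", "subject_len": "short"},
              {"lang": "en", "links": "many", "subject_len": "short"},
              {"lang": "fr", "links": "few",  "subject_len": "long"}]
    print(len(split(emails, ["lang", "links", "subject_len"])))  # -> 2 campaigns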

Relevance:

30.00%

Publisher:

Abstract:

Vietnam is participating in a global de-worming effort that aims to treat 650 million school children regularly by 2010. The treatment used in Vietnam is single-dose oral mebendazole (Phardazone®) 500 mg. We tested the efficacy of single-dose mebendazole 500 mg in the therapy of hookworm infection in a randomized double-blind placebo-controlled trial among 271 Vietnamese schoolchildren. The treatment efficacy of single-dose mebendazole in children did not differ significantly from placebo, with a reduction in mean eggs per gram of feces relative to placebo of 31% (95% CI −9 to 56%, P = 0.1). In light of these findings we then carried out a similar randomized trial comparing triple-dose mebendazole, single-dose albendazole, and triple-dose albendazole against placebo in 209 adults in the same area. The estimated reduction in mean post-treatment eggs per gram of feces relative to placebo was 63% (95% CI 30 to 81%) for triple mebendazole, 75% (47 to 88%) for single albendazole, and 88% (58 to 97%) for triple albendazole. Our results suggest that single-dose oral mebendazole has low efficacy against hookworm infection in Vietnam and that it should be replaced by albendazole. These findings are of major public health relevance given the opportunity costs of treating entire populations with ineffective therapies. We recommend that the efficacy of anti-helminth therapies be pilot-tested before implementation of national gut worm control programs.
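For readers unfamiliar with the endpoint, a reduction "relative to placebo" of this kind is conventionally computed as (a standard formulation; the trial's exact estimator, e.g. arithmetic versus geometric means, is not specified in the abstract):

\[
\text{reduction} = 1 - \frac{\bar{E}_{\text{treatment}}}{\bar{E}_{\text{placebo}}},
\]

so, with purely illustrative numbers, a post-treatment mean of 120 eggs per gram in a treatment arm against 1,000 eggs per gram under placebo would give 1 − 120/1000 = 88%, the point estimate reported here for triple-dose albendazole.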

Relevance:

30.00%

Publisher:

Abstract:

The self-ordered pointing test (SOPT; Petrides & Milner, 1982) is a test of non-spatial executive working memory requiring the ability to generate and monitor a sequence of responses. Although used with developmental clinical populations, there are few normative data against which to compare atypical performance. Typically developing children (5-11 years) and young adults performed two versions of the SOPT, one using pictures of familiar objects and the other hard-to-verbalise abstract designs. Performance improved with age, but the children did not reach adult levels of performance. Participants of all ages found the object condition easier than the abstract condition, suggesting that verbal processes are utilised by the SOPT. However, performance on the task was largely independent of verbal and nonverbal cognitive ability. Overall, the results suggest that the SOPT is a sensitive measure of executive working memory.

Relevance:

30.00%

Publisher:

Abstract:

This paper evaluates the hypothesis that race is a determining factor in access to quality employment in Colombia during 2007. Using data from the Large Integrated Household Survey (2007-I), we estimate a generalized ordered logit model. The results provide evidence that individuals self-identified as Afro-Colombian have a higher probability of being in a low-quality job than other Colombians. This probability is higher by 1.9% in Cali, 3.4% in Bogotá, 12.6% in Barranquilla, 1.8% in Cartagena and 1.1% in Medellín, and by 3.8% overall in these five cities, results that could indicate racial discrimination against Afro-Colombians in the Colombian labour market.
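For reference, the generalized ordered logit relaxes the proportional-odds assumption of the standard ordered logit by letting the coefficients vary across thresholds; in its standard textbook form (the paper's exact specification is not given in the abstract):

\[
P(Y_i > j \mid x_i) = \frac{\exp(\alpha_j + x_i'\beta_j)}{1 + \exp(\alpha_j + x_i'\beta_j)}, \qquad j = 1, \dots, J-1,
\]

where Y_i is the ordered job-quality category for individual i and, unlike in the standard ordered logit, the coefficient vector β_j may differ across the J−1 thresholds.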

Relevance:

30.00%

Publisher:

Abstract:

RATIONALE: A key objective of A Very Early Rehabilitation Trial is to determine whether the intervention, very early mobilisation following stroke, is cost-effective. Resource use data were collected to enable an economic evaluation to be undertaken, and a plan for the main economic analyses was written prior to the completion of follow-up data collection. AIM AND HYPOTHESIS: To report the methods used to collect resource use data, pre-specify the main economic evaluation analyses and report other intended exploratory analyses of resource use data. SAMPLE SIZE ESTIMATES: Recruitment to the trial has been completed. A total of 2,104 participants from 56 stroke units across three geographic regions participated in the trial. METHODS AND DESIGN: Resource use data were collected prospectively alongside the trial using standardised tools. The primary economic evaluation method is a cost-effectiveness analysis comparing resource use over 12 months and the health outcomes of the intervention against a usual-care comparator. A cost-utility analysis is also intended. STUDY OUTCOME: The primary outcome in the cost-effectiveness analysis will be favourable outcome (modified Rankin Scale score 0-2) at 12 months. The cost-utility analysis will use health-related quality of life, reported as quality-adjusted life years gained over a 12-month period, as measured by the modified Rankin Scale and the Assessment of Quality of Life. DISCUSSION: Outcomes of the economic evaluation analysis will inform the cost-effectiveness of very early mobilisation following stroke compared with usual care. The exploratory analysis will report patterns of resource use in the first year following stroke.
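For context, a cost-effectiveness comparison of this design is conventionally summarised by an incremental cost-effectiveness ratio; a standard textbook definition (not a detail taken from the protocol) is

\[
\mathrm{ICER} = \frac{\bar{C}_{\text{intervention}} - \bar{C}_{\text{usual care}}}{\bar{E}_{\text{intervention}} - \bar{E}_{\text{usual care}}},
\]

where \(\bar{C}\) is the mean 12-month cost per participant and \(\bar{E}\) the mean effect, here the proportion achieving a favourable outcome (modified Rankin Scale 0-2), or QALYs gained in the cost-utility analysis.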