830 results for IPv6, Denial of Service, Coloured Petri Nets, Risk Analysis, IPv6 threats


Relevance:

100.00%

Publisher:

Abstract:

Background: The role of temporary ovarian suppression with luteinizing hormone-releasing hormone agonists (LHRHa) in the prevention of chemotherapy-induced premature ovarian failure (POF) is still controversial. Our meta-analysis of randomized, controlled trials (RCTs) investigates whether the use of LHRHa during chemotherapy in premenopausal breast cancer patients reduces the treatment-related POF rate, increases the pregnancy rate, and impacts disease-free survival (DFS). Methods: A literature search using PubMed, Embase, and the Cochrane Library, and the proceedings of major conferences, was conducted up to 30 April 2015. Odds ratios (ORs) and 95% confidence intervals (CIs) for POF (i.e. POF by study definition, and POF defined as amenorrhea 1 year after chemotherapy completion) and for patients with pregnancy, as well as hazard ratios (HRs) and 95% CIs for DFS, were calculated for each trial. Pooled analysis was carried out using fixed- and random-effects models. Results: A total of 12 RCTs including 1231 breast cancer patients were eligible. The use of LHRHa was associated with a significantly reduced risk of POF (OR 0.36, 95% CI 0.23-0.57; P < 0.001), yet with significant heterogeneity (I2 = 47.1%, Pheterogeneity = 0.026). In eight studies reporting amenorrhea rates 1 year after chemotherapy completion, the addition of LHRHa reduced the risk of POF (OR 0.55, 95% CI 0.41-0.73, P < 0.001) without heterogeneity (I2 = 0.0%, Pheterogeneity = 0.936). In five studies reporting pregnancies, more patients treated with LHRHa achieved pregnancy (33 versus 19 women; OR 1.83, 95% CI 1.02-3.28, P = 0.041; I2 = 0.0%, Pheterogeneity = 0.629). In three studies reporting DFS, no difference was observed (HR 1.00, 95% CI 0.49-2.04, P = 0.939; I2 = 68.0%, Pheterogeneity = 0.044). Conclusion: Temporary ovarian suppression with LHRHa in young breast cancer patients is associated with a reduced risk of chemotherapy-induced POF and seems to increase the pregnancy rate, without an apparent negative effect on prognosis.
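
For context, a minimal sketch of the fixed-effect, inverse-variance pooling of log odds ratios that underlies this kind of meta-analysis; the per-trial ORs and confidence intervals below are hypothetical, not values from the included RCTs.

    import math
    from scipy.stats import norm

    # hypothetical (OR, lower 95% CI, upper 95% CI) per trial -- not data from the paper
    trials = [(0.30, 0.15, 0.60), (0.45, 0.25, 0.81), (0.55, 0.30, 1.01)]

    weights, weighted_logs = [], []
    for or_, lo, hi in trials:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
        w = 1.0 / se ** 2                                # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * log_or)

    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci_low = math.exp(pooled_log - 1.96 * pooled_se)
    ci_high = math.exp(pooled_log + 1.96 * pooled_se)
    p_value = 2 * (1 - norm.cdf(abs(pooled_log / pooled_se)))
    print(f"pooled OR = {math.exp(pooled_log):.2f}, "
          f"95% CI {ci_low:.2f}-{ci_high:.2f}, P = {p_value:.3f}")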

Relevance:

100.00%

Publisher:

Abstract:

Botnets, which consist of thousands of compromised machines, pose a significant threat to other systems by launching Distributed Denial of Service attacks, keylogging, and installing backdoors. In response to this threat, new effective techniques are needed to detect the presence of botnets. In this paper, we use an interception technique to monitor Windows Application Programming Interface system calls made by communication applications. Existing approaches for botnet detection are based on finding bot traffic patterns. Our approach does not depend on finding patterns but rather monitors changes of behaviour in the system. In addition, we present our idea of detecting botnets based on correlating logs from different hosts.
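
A minimal sketch of the behaviour-change idea described here, assuming a per-process baseline of monitored API call counts; the call names, baseline values, and threshold are illustrative assumptions, not the paper's implementation.

    from collections import Counter

    # hypothetical per-process baseline of monitored Windows API call counts per minute
    BASELINE = {"messenger.exe": Counter(send=120, SetWindowsHookExA=0)}
    THRESHOLD = 3.0  # flag when an observed call rate exceeds 3x the (smoothed) baseline

    def behaviour_changed(process: str, observed: Counter) -> bool:
        """Return True if any monitored API call deviates strongly from the baseline."""
        base = BASELINE.get(process, Counter())
        return any(count > THRESHOLD * (base[call] + 1) for call, count in observed.items())

    # a keystroke-hooking call appearing in a communication application is flagged
    print(behaviour_changed("messenger.exe", Counter(send=110, SetWindowsHookExA=5)))  # True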

Relevance:

100.00%

Publisher:

Abstract:

Ensuring the security of computers is a non-trivial task, with many techniques used by malicious users to compromise these systems. In recent years a new threat has emerged in the form of networks of hijacked zombie machines used to perform complex distributed attacks such as denial of service and to obtain sensitive data such as password information. These zombie machines are said to be infected with a 'bot' - a malicious piece of software which is installed on a host machine and is controlled by a remote attacker, termed the 'botmaster' of a botnet. In this work, we use the biologically inspired dendritic cell algorithm (DCA) to detect the existence of a single bot on a compromised host machine. The DCA is an immune-inspired algorithm based on an abstract model of the behaviour of the dendritic cells of the human body. Anomaly detection in the DCA is based on the correlation of behavioural attributes such as keylogging and packet-flooding behaviour. The results of the application of the DCA to the detection of a single bot show that the algorithm is a successful technique for the detection of such malicious software without responding to normally running programs.
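
A minimal sketch of the signal-correlation step of a DCA-style detector, under illustrative assumptions; the signal values, weighting, and threshold are not taken from the paper.

    def dca_classify(samples, threshold=0.5):
        """samples: (danger, safe) signal pairs in [0, 1] collected while observing a process."""
        mature = semi_mature = 0
        for danger, safe in samples:
            # danger signals (e.g. packet flooding, keylogging activity) push towards 'mature';
            # safe signals (normal behaviour) push towards 'semi-mature'
            if danger > safe:
                mature += 1
            else:
                semi_mature += 1
        mcav = mature / (mature + semi_mature)  # mature context antigen value
        return "anomalous" if mcav > threshold else "normal"

    print(dca_classify([(0.9, 0.1), (0.7, 0.3), (0.2, 0.8)]))  # anomalous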

Relevance:

100.00%

Publisher:

Abstract:

Part 17: Risk Analysis

Relevance:

100.00%

Publisher:

Abstract:

Integrated Master's dissertation in Veterinary Medicine

Relevance:

100.00%

Publisher:

Abstract:

An interest rate sensitivity assessment framework based on fixed income yield indexes is developed and applied to two types of emerging market corporate debt: investment grade and high yield exposures. Our research advances beyond correlation analyses focused on co-movements in yields and/or spreads of risky and risk-free assets. We show that correlation-based analyses of interest rate sensitivity could appear rather inconclusive and, hence, we investigate the bottom-line profit and loss of a hypothetical model portfolio of corporates. We consider historical data covering the period 2002-2015, which enables us to assess the interest rate sensitivity of assets during the development, the apogee, and the aftermath of the global financial crisis. Based on empirical evidence, for both investment-grade and speculative-grade securities, we find that emerging market corporates exhibit two different regimes of sensitivity to interest rate changes. We observe switching from a positive sensitivity under normal market conditions to a negative one during distressed phases of business cycles. This research sheds light on how financial institutions may approach interest rate risk management, evidencing that even plain vanilla portfolios of emerging market corporates, which on average could appear rather insensitive to interest rate risk, in fact present a binary behavior in their interest rate sensitivities. Our findings allow banks and financial institutions to optimize economic capital under Basel III regulatory capital rules.
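
As an illustration of the two sensitivity regimes (not the paper's model or data), the sketch below regresses a synthetic portfolio P&L on daily yield changes separately for a "normal" and a "distressed" sub-period.

    import numpy as np

    rng = np.random.default_rng(0)
    d_yield = rng.normal(0.0, 0.02, 500)                          # daily risk-free yield changes
    pnl = np.concatenate([
        0.8 * d_yield[:400] + rng.normal(0, 0.01, 400),           # positive sensitivity (normal)
        -1.2 * d_yield[400:] + rng.normal(0, 0.01, 100),          # sign flips (distressed)
    ])

    for name, idx in [("normal regime", slice(0, 400)), ("distressed regime", slice(400, 500))]:
        beta = np.polyfit(d_yield[idx], pnl[idx], 1)[0]           # slope = estimated sensitivity
        print(f"{name}: interest rate sensitivity = {beta:+.2f}")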

Relevance:

100.00%

Publisher:

Abstract:

With recent advances in remote sensing processing technology, it has become more feasible to begin analysis of the enormous historical archive of remotely sensed data. These historical data provide valuable information on a wide variety of topics which can influence the lives of millions of people if processed correctly and in a timely manner. One such field of benefit is landslide mapping and inventory. These data provide a historical reference to those who live near high-risk areas so that future disasters may be avoided. In order to properly map landslides remotely, an optimum method must first be determined. Historically, mapping has been attempted using pixel-based methods such as unsupervised and supervised classification. These methods are limited in that they characterize an image only spectrally, based on single pixel values, which yields results prone to false positives and often without meaningful objects. Recently, several reliable methods of Object Oriented Analysis (OOA) have been developed which utilize a full range of spectral, spatial, textural, and contextual parameters to delineate regions of interest. A comparison of these two approaches on a historical dataset of the landslide-affected city of San Juan La Laguna, Guatemala has proven the benefits of OOA methods over those of unsupervised classification. Overall accuracies of 96.5% and 94.3% and F-scores of 84.3% and 77.9% were achieved for the OOA and unsupervised classification methods, respectively. The larger difference in F-score results from the low precision of unsupervised classification, caused by poor false-positive removal, the greatest shortcoming of this method.
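
For reference, the quoted overall accuracy and F-score follow directly from the counts of a binary confusion matrix; the sketch below uses hypothetical pixel counts, not the study's data.

    def evaluate(tp, fp, fn, tn):
        precision = tp / (tp + fp)           # share of detected landslide pixels that are correct
        recall = tp / (tp + fn)              # share of true landslide pixels that are detected
        f_score = 2 * precision * recall / (precision + recall)
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        return accuracy, f_score

    acc, f1 = evaluate(tp=420, fp=60, fn=90, tn=9430)  # hypothetical counts
    print(f"overall accuracy = {acc:.1%}, F-score = {f1:.1%}")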

Relevance:

100.00%

Publisher:

Abstract:

Mechanical fatigue is a failure phenomenon that occurs due to the repeated application of mechanical loads. Very High Cycle Fatigue (VHCF) is considered the domain of fatigue life greater than 10 million load cycles. Increasing numbers of structural components have service lives in the VHCF regime, for instance in automotive and high-speed train transportation, gas turbine disks, and components of paper production machinery. Safe and reliable operation of these components depends on knowledge of their VHCF properties. In this thesis, both experimental tools and theoretical modelling were utilized to develop a better understanding of the VHCF phenomena. In the experimental part, ultrasonic fatigue testing at 20 kHz of cold-rolled and hot-rolled stainless steel grades was conducted, and fatigue strengths in the VHCF regime were obtained. The mechanisms of fatigue crack initiation and short crack growth were investigated using electron microscopy. For the cold-rolled stainless steels, crack initiation and early growth occurred through the formation of the Fine Granular Area (FGA), observed on the fracture surface and in TEM observations of cross-sections. Crack growth in the FGA seems to control more than 90% of the total fatigue life. For the hot-rolled duplex stainless steels, fatigue crack initiation occurred due to the accumulation of plastic fatigue damage at the external surface, and early crack growth proceeded through a crystallographic growth mechanism. Theoretical modelling of complex cracks involving kinks and branches in an elastic half-plane under static loading was carried out using the Distributed Dislocation Dipole Technique (DDDT). The technique was implemented for 2D crack problems, and both fully open and partially closed crack cases were analyzed. The main aim of the development of the DDDT was to compute stress intensity factors. An accuracy of 2% in the computations was attainable compared with solutions obtained by the Finite Element Method.
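
For orientation only (a textbook relation, not reproduced from the thesis), the mode-I stress intensity factor that such techniques compute has the general form

    % mode-I stress intensity factor for a crack of half-length a under remote stress \sigma;
    % Y is a dimensionless geometry correction factor
    K_I = Y \, \sigma \sqrt{\pi a}

where the DDDT and the Finite Element Method each provide a way of evaluating this quantity for cracks with kinks and branches.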

Relevance:

100.00%

Publisher:

Abstract:

We address the problem of automotive cybersecurity from the point of view of Threat Analysis and Risk Assessment (TARA). The central question that motivates the thesis is that of the acceptability of risk, which is vital in deciding whether to implement cybersecurity solutions. For this purpose, we develop a quantitative framework that takes as input the results of risk assessment and defines measures of various facets of a possible risk response; we then exploit the natural presence of trade-offs (cost versus effectiveness) to formulate the problem as a multi-objective optimization. Finally, we develop a stochastic model of the future evolution of the risk factors by means of Markov chains and adapt the formulations of the optimization problems to this non-deterministic context. The thesis is the result of a collaboration with the Vehicle Electrification division of Marelli, in particular with the Cybersecurity team based in Bologna; this allowed us to consider a particular instance of the problem, derived from a real TARA, in order to test both the deterministic and the stochastic framework in a real-world application. The collaboration also explains why we often assume the point of view of a tier-1 supplier; however, the analyses performed can be adapted to any other level of the supply chain.
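
A minimal sketch of the cost-versus-effectiveness trade-off formulated here as a multi-objective problem: enumerate candidate risk responses and keep the Pareto-optimal (non-dominated) ones. The candidate measures, costs, and risk-reduction values are hypothetical, not taken from the TARA.

    # hypothetical risk responses: (cost in k-euro, fraction of assessed risk removed)
    candidates = {
        "accept risk (do nothing)":   (0.0, 0.00),
        "secure boot":                (30.0, 0.40),
        "CAN message authentication": (50.0, 0.65),
        "full HSM rollout":           (120.0, 0.60),
    }

    def pareto_front(options):
        """Keep options for which no alternative is both cheaper-or-equal and at least as effective."""
        front = []
        for name, (cost, benefit) in options.items():
            dominated = any(c <= cost and b >= benefit and (c, b) != (cost, benefit)
                            for c, b in options.values())
            if not dominated:
                front.append(name)
        return front

    # -> ['accept risk (do nothing)', 'secure boot', 'CAN message authentication']
    print(pareto_front(candidates))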

Relevance:

100.00%

Publisher:

Abstract:

To evaluate the influence of a fluorescent dye (rhodamine B) on the physical and mechanical properties of three different luting cements: a conventional adhesive luting cement (RelyX ARC, 3M/ESPE), a self-adhesive luting cement (RelyX U-200, 3M/ESPE), and a self-etching, self-adhesive luting cement (SeT PP, SDI). The cements were mixed with 0.03 wt% rhodamine B, formed into bar-shaped specimens (n = 10), and light-cured using an LED curing unit (Radii, SDI) with a radiant exposure of 32 J/cm². Knoop hardness (KHN), flexural strength (FS), and Young's modulus (YM) were evaluated after storage for 24 h. Outcomes were subjected to two-way ANOVA and Tukey's test (P = 0.05) for multiple comparisons. No significant differences in FS or YM were observed among the tested groups (P ≥ 0.05); however, the addition of rhodamine B increased the hardness of the luting cements tested. The addition of a fluorescent agent at 0.03 wt% concentration does not negatively affect the physical-mechanical properties or polymerization behavior of the luting cements.
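
A minimal sketch, not the study's analysis script, of how a two-way ANOVA with Tukey's post-hoc test can be run in Python with statsmodels; the hardness readings and group labels below are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # hypothetical Knoop hardness readings per cement and dye condition
    df = pd.DataFrame({
        "khn":    [52, 55, 58, 61, 47, 49, 50, 54, 44, 45, 48, 52],
        "cement": ["ARC"] * 4 + ["U200"] * 4 + ["SeT"] * 4,
        "dye":    ["no", "no", "yes", "yes"] * 3,
    })

    model = smf.ols("khn ~ C(cement) * C(dye)", data=df).fit()
    print(anova_lm(model, typ=2))                        # two-way ANOVA table
    print(pairwise_tukeyhsd(df["khn"], df["cement"]))    # Tukey HSD across cements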

Relevance:

100.00%

Publisher:

Abstract:

Chronic telogen effluvium (CTE), a poorly understood condition, can be confused with or may be a prodrome to female pattern hair loss (FPHL). The pathogenesis of both is related to follicle cycle shortening and possibly to changes in blood supply. To analyze histomorphometric and immunohistochemical findings through vascular endothelial growth factor (VEGF), Ki-67, and CD31 immunostaining in scalp biopsies of 20 patients with CTE, 17 patients with mild FPHL, and 9 controls. The Ki-67 index and VEGF optical density were analyzed at the follicular outer sheath using ImageJ software. CD31 microvessel density was assessed using a Chalkley grid. Significant follicle miniaturization and a higher density of non-anagen follicles were found in FPHL compared with patients with CTE and controls. The Ki-67+ index correlated positively with FPHL histological features. The FPHL group showed the highest VEGF optical density, followed by the CTE and control groups. No differences were found in CD31 microvessel density among the three groups. The histomorphometric results establish CTE as a distinct disorder, separate from FPHL from its outset; its pathogenic mechanisms are also distinct. These findings support the proposed mechanism of 'immediate telogen release' for CTE, leading to cycle synchronization. For FPHL, accelerated anagen follicular mitotic rates and, thus, higher Ki-67 and VEGF values would leave less time for differentiation, resulting in hair miniaturization.

Relevance:

100.00%

Publisher:

Abstract:

This work describes a photo-reactor to perform in-line degradation of organic compounds by the photo-Fenton reaction using a Sequential Injection Analysis (SIA) system. A copper phthalocyanine-3,4′,4″,4‴-tetrasulfonic acid tetrasodium salt dye solution was used as a model compound for the phthalocyanine family, whose pigments are widely used in the automotive coatings industry. Based on preliminary tests, 97% color removal was obtained from a solution containing 20 µmol L⁻¹ of this dye.

Relevance:

100.00%

Publisher:

Abstract:

Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even clinical practice. Although communication has a central role in interdisciplinary collaboration, and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for the extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of 'what if' situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.

Relevance:

100.00%

Publisher:

Abstract:

Background: Considering the broad variation in the expression of housekeeping genes among tissues and experimental situations, studies using quantitative RT-PCR require strict definition of adequate endogenous controls. For glioblastoma, the most common type of tumor in the central nervous system, there was no previous report regarding this issue. Results: Here we show that, amongst seven frequently used housekeeping genes, TBP and HPRT1 are adequate references for glioblastoma gene expression analysis. Evaluation of the expression levels of 12 target genes utilizing different endogenous controls revealed that the normalization method applied might introduce errors in the estimation of relative quantities. Genes presenting expression levels which do not significantly differ between tumor and normal tissues can be considered either increased or decreased if unsuitable reference genes are applied. Most importantly, genes showing significant differences in expression levels between tumor and normal tissues can be missed. We also demonstrate that the Holliday Junction Recognizing Protein, a novel DNA repair protein overexpressed in lung cancer, is extremely overexpressed in glioblastoma, with a median change of about 134-fold. Conclusion: Altogether, our data show the relevance of prior validation of candidate control genes for each experimental model and indicate TBP plus HPRT1 as suitable references for studies on glioblastoma gene expression.
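
For context, a minimal sketch of the relative quantification (Livak 2^-ddCt) step into which the choice of reference gene feeds; the Ct values below are hypothetical.

    def relative_expression(ct_target_tumor, ct_ref_tumor, ct_target_normal, ct_ref_normal):
        """Fold change of a target gene in tumor vs. normal tissue, normalized to a reference gene."""
        d_ct_tumor = ct_target_tumor - ct_ref_tumor    # normalize to reference (e.g. TBP or HPRT1)
        d_ct_normal = ct_target_normal - ct_ref_normal
        dd_ct = d_ct_tumor - d_ct_normal
        return 2 ** (-dd_ct)

    # a target gene amplifying 7 cycles "earlier" in tumor after normalization -> ~128-fold increase
    print(relative_expression(22.0, 20.0, 29.0, 20.0))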