985 results for Prieto, Ibrahim
Abstract:
In El ciego de Quíos, the novelist Antonio Prieto invents the life of Homer, nourishing it with figures and events drawn from the Iliad and, above all, from the Odyssey. The narrative is enriched by historical knowledge surrounding the two great poems and is framed by a wide-ranging mythological display. Beyond this, the novel attains a transcendent meaning. We discover as its narrative engine the Homeric effort to overcome human transience through writing, which creates a timeless bond with our author's own longing; the novel thus offers a supreme example of the phenomenon Antonio Prieto himself has called fusión mítica (mythical fusion).
Abstract:
"Estudio histórico a que se adjudicó en el certamen escolar nacional celebrado en Zaragoza en marzo de 1897 el premio de honor y el correspondiente al tema X del certamen."
Abstract:
Monitoring Internet traffic is critical for acquiring a good understanding of threats to computer and network security and for designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content, including monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another widely tried and accepted method for monitoring and analyzing malicious traffic is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic reduces the amount of collected traffic and makes honeypots a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most work on honeypots has been devoted to designing new honeypots or optimizing current ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, remain immature, and analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or outbreaks of new automated malicious code, such as worms. The outcomes of this research include:
• Identification of the repeated use of attack tools and attack processes by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers' activities present in low-interaction honeypots and to visualize attackers' behaviors;
• Detection of new attacks in low-interaction honeypot traffic using the residual space of the principal components and the squared prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
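For readers unfamiliar with the residual-space technique listed above, the following is a minimal sketch of flagging anomalous traffic windows with the squared prediction error over a fitted principal subspace. It uses NumPy and scikit-learn rather than the thesis's own implementation, and the feature layout, synthetic baseline data and percentile control limit are assumptions made for the example.

```python
# Minimal sketch of PCA residual-space anomaly detection with the squared
# prediction error (SPE). Synthetic data and the 99th-percentile limit are
# illustrative assumptions, not details taken from the thesis.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Baseline honeypot traffic summaries: rows = time windows, columns = features
# (e.g. per-port packet counts, inter-arrival statistics). The baseline is
# synthetic data with a low-rank correlation structure for PCA to learn.
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 8))
baseline = latent @ mixing + 0.3 * rng.normal(size=(500, 8))

pca = PCA(n_components=3).fit(baseline)

def spe(x):
    """Squared prediction error: squared distance of x from the principal subspace."""
    reconstructed = pca.inverse_transform(pca.transform(x))
    return np.sum((x - reconstructed) ** 2, axis=1)

# Simple control limit from baseline SPE values (99th percentile here; the
# thesis uses a statistical limit whose exact form is not reproduced).
limit = np.percentile(spe(baseline), 99)

# A window that does not follow the learned correlation structure lands far
# from the principal subspace and exceeds the limit.
new_window = rng.normal(size=(1, 8)) * 3.0
print(f"SPE = {spe(new_window)[0]:.2f}, limit = {limit:.2f}, "
      f"flag as new attack pattern: {spe(new_window)[0] > limit}")
```

In the recursive variant mentioned in the outcomes, the PCA model would be updated as new traffic windows arrive rather than refitted on a fixed baseline.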
Abstract:
The lack of a universally accepted and comprehensive taxonomy of cybercrime seriously impedes international efforts to accurately identify, report and monitor cybercrime trends. There is, not surprisingly, a corresponding international disconnect on the cybercrime legislation front, a much more serious problem and one which the International Telecommunication Union (ITU) says requires "the urgent attention of all nations". Yet, despite the existence of the Council of Europe (CoE) Convention on Cybercrime, a proposal for a global cybercrime treaty was rejected by the United Nations (UN) as recently as April 2010. This paper presents a refined and comprehensive taxonomy of cybercrime and demonstrates its utility for widespread use. It analyses how the USA, the UK, Australia and the UAE align with the CoE Convention and finds that more needs to be done to achieve conformance. We conclude with an analysis of the approaches used to fight cybercrime in Australia (Queensland) and in the UAE (Abu Dhabi), and identify a number of shared problems.
Abstract:
Fractures of long bones are sometimes treated using various types of fracture fixation devices, including internal plate fixators. These are specialised plates used to bridge the fracture gap(s) while anatomically aligning the bone fragments; the plate is secured in position by screws. The aim of such a device is to support and promote the natural healing of the bone. When using an internal fixation device, the clinician must decide upon many parameters, for example the type of plate and where to position it, and how many screws to use and where to place them. While a number of experimental and computational studies concerning screw configuration have been reported in the literature, there is still inadequate information available on the influence of screw configuration on fracture healing. Because screw configuration influences the amount of flexibility at the fracture site, it has a direct influence on the fracture healing process; it is therefore important that the chosen screw configuration does not inhibit healing. In addition, screw configuration plays an important role in the distribution of stresses in the plate under the applied loads. A plate that experiences high stresses is prone to early failure, so the screw configuration used should not encourage the occurrence of high stresses.

This project develops a computational program, written in the Fortran programming language, that performs mathematical optimisation to determine the screw configuration of an internal fixation device within constraints on interfragmentary movement while minimising the corresponding stress in the plate. The optimal solution thus suggests the positioning and number of screws that satisfy the predefined constraints on interfragmentary movement. For a given screw configuration, the interfragmentary displacement and the stress occurring in the plate were calculated by the finite element method. The screw configurations were changed iteratively, and each time the corresponding interfragmentary displacements were compared with the predefined constraints; the corresponding stress was also compared with the previously calculated stress value to determine whether there was a reduction. This process was continued until an optimal solution was achieved.

The optimisation program has been shown to successfully predict the optimal screw configuration in two cases. The first was a simplified bone construct, for which the screw configuration solution was comparable with those recommended in the biomechanical literature. The second was a femoral construct, for which the resulting screw configuration was shown to be similar to those used in clinical cases. The optimisation method and program developed in this study have shown potential for use in further investigations, with improvements to the optimisation criteria and the efficiency of the program.
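The iterative search described in the abstract can be pictured with a short schematic loop. The sketch below is a Python analogue, not the thesis's Fortran program: the finite-element evaluation is replaced by a placeholder function, and the hole count, screw count and interfragmentary-movement window are assumed values chosen only for illustration.

```python
# Schematic analogue of the constrained screw-configuration search described
# above. evaluate() is a placeholder; in the thesis these quantities come from
# a finite element model of the plate, screws and bone.
from itertools import combinations

N_HOLES = 10                  # candidate screw holes on one fragment (assumed)
N_SCREWS = 3                  # screws to place in that fragment (assumed)
IFM_MIN, IFM_MAX = 0.2, 1.0   # assumed admissible interfragmentary movement (mm)

def evaluate(holes):
    """Placeholder for the finite-element analysis: returns (ifm_mm, plate_stress_mpa)."""
    spread = max(holes) - min(holes)
    ifm = 1.5 / (1 + spread)          # construct becomes stiffer as screws spread out
    stress = 80 + 40 / (1 + spread)   # plate stress falls with a longer working length
    return ifm, stress

best = None
for holes in combinations(range(N_HOLES), N_SCREWS):
    ifm, stress = evaluate(holes)
    if not (IFM_MIN <= ifm <= IFM_MAX):      # reject configurations outside the IFM window
        continue
    if best is None or stress < best[1]:     # keep the configuration with the lowest stress
        best = (holes, stress, ifm)

print("best screw holes:", best[0],
      "plate stress:", round(best[1], 1), "MPa,",
      "IFM:", round(best[2], 2), "mm")
```

In the actual program, each candidate configuration accepted or rejected by the loop corresponds to one finite element analysis of the plate-screw-bone construct.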
Abstract:
The construction of timelines of computer activity is a part of many digital investigations. These timelines of events are composed of traces of historical activity drawn from system logs and potentially from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work introduces a software tool (CAT Detect) for the detection of inconsistency within timelines of computer activity. We examine the impact of deliberate tampering through experiments conducted with our prototype software tool. Based on the results of these experiments, we discuss techniques which can be employed to deal with such temporal inconsistencies.
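As a concrete illustration of the kind of temporal contradiction such a tool looks for, the sketch below flags a file whose recorded modification time precedes its creation time. It is not CAT Detect itself; the event fields and the single rule shown are assumptions made for the example.

```python
# Toy timeline-consistency check: flag files modified before they were created,
# a simple contradiction that may indicate tampering with event records.
from datetime import datetime

events = [
    {"source": "filesystem", "action": "file_created",  "path": "report.doc",
     "time": datetime(2024, 3, 1, 10, 15)},
    {"source": "system_log", "action": "user_logon",    "path": None,
     "time": datetime(2024, 3, 1, 10, 30)},
    {"source": "filesystem", "action": "file_modified", "path": "report.doc",
     "time": datetime(2024, 3, 1, 9, 50)},   # earlier than creation: inconsistent
]

def find_inconsistencies(events):
    """Return (path, modified_time, created_time) for files whose recorded
    modification time precedes their creation time."""
    created = {e["path"]: e["time"] for e in events if e["action"] == "file_created"}
    issues = []
    for e in events:
        if e["action"] == "file_modified" and e["path"] in created:
            if e["time"] < created[e["path"]]:
                issues.append((e["path"], e["time"], created[e["path"]]))
    return issues

for path, modified, made in find_inconsistencies(events):
    print(f"Inconsistent timeline for {path}: modified {modified} before created {made}")
```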
Abstract:
Aim: The aim of this pilot study is to describe the use of an Emergency Department (ED) at a large metropolitan teaching hospital by patients who speak English at home and by patients who speak other languages at home. Methods: All data were retrieved from the Emergency Department Information System (EDIS) of this tertiary teaching hospital in Brisbane. Patients were divided into two groups based on the language spoken at home: patients who speak English only at home (SEO) and patients who do not speak English only at home (NSEO). Mode of arrival, length of ED stay and the proportion of hospital admissions were compared between the two groups using SPSS V18 software. Results: A total of 69,494 patients visited this hospital ED in 2009, with 67,727 (97.5%) in the SEO group and 1,281 (1.8%) in the NSEO group. The proportion arriving by ambulance was significantly higher among SEO patients, 23,172 (34.2%), than among NSEO patients, 397 (31.0%), p < 0.05. NSEO patients had a longer length of stay in the ED (M = 337.21 minutes, SD = 285.9) than SEO patients (M = 290.9, SD = 266.8), a difference of 46.3 minutes (95% CI 30.5, 62.1; p < 0.001). Hospital admissions among NSEO patients, 402 (31.9%), were higher than among SEO patients, 17,652 (26.6%), p < 0.001. Conclusion: The lower ambulance utilisation rate, longer ED length of stay and higher hospital admission rate among NSEO patients compared with SEO patients are consistent with other international studies and may be due to language barriers.
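For context, the two headline comparisons can be reproduced from the summary figures quoted in the abstract. The sketch below uses SciPy in place of SPSS V18, with a two-sample t-test computed from the reported summary statistics and a chi-squared test on the reported admission counts; it is an illustration, not the study's own analysis.

```python
# Reproducing the length-of-stay and admission comparisons from the summary
# figures reported in the abstract; all counts, means and SDs are taken as
# reported, and SciPy stands in for the SPSS analysis used in the study.
from scipy import stats

# Length of ED stay (minutes): NSEO vs SEO, two-sample t-test from summary stats.
t, p = stats.ttest_ind_from_stats(mean1=337.21, std1=285.9, nobs1=1281,
                                  mean2=290.9,  std2=266.8, nobs2=67727)
print(f"Length-of-stay difference: t = {t:.2f}, p = {p:.2g}")

# Hospital admission: admitted vs not admitted, NSEO vs SEO, chi-squared test.
nseo = [402, 1281 - 402]
seo = [17652, 67727 - 17652]
chi2, p, dof, _ = stats.chi2_contingency([nseo, seo])
print(f"Admission proportions: chi2 = {chi2:.1f}, df = {dof}, p = {p:.2g}")
```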