908 results for Security-critical software


Relevance: 30.00%

Abstract:

The scope of this paper was to analyze the association between homicides and public security indicators in Sao Paulo between 1996 and 2008, after controlling for the unemployment rate and the proportion of youths in the population. A time-series ecological study covering 1996 to 2008 was conducted with Sao Paulo as the unit of analysis. The dependent variable was the number of deaths by homicide per year; the main independent variables were the arrest-incarceration rate, access to firearms, and police activity. Data analysis was conducted using Stata/IC 10.0 software, with simple and multivariate negative binomial regression models. Deaths by homicide were significantly associated with arrest-incarceration and with police activity in the simple regression analysis. Access to firearms was not significantly associated with the reduction in the number of deaths by homicide (p>0.05). After adjustment, the associations with both public security indicators were no longer significant. In Sao Paulo, public security indicators are less important as explanatory factors for the reduction in homicide rates after adjustment for the unemployment rate and the reduction in the proportion of youths. The results reinforce the importance of socioeconomic and demographic factors for a change in the public security scenario in Sao Paulo.
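
A minimal sketch of the kind of adjusted negative binomial model the study describes, here in Python with statsmodels rather than the authors' Stata/IC 10.0 (the CSV file and column names are hypothetical):

    # Negative binomial regression of yearly homicide counts on public
    # security indicators, adjusted for unemployment and youth proportion.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("sao_paulo_yearly.csv")  # one row per year, 1996-2008
    X = sm.add_constant(df[["incarceration_rate", "firearm_access",
                            "police_activity", "unemployment_rate",
                            "youth_proportion"]])
    model = sm.GLM(df["homicides"], X,
                   family=sm.families.NegativeBinomial())
    print(model.fit().summary())  # adjusted association for each indicator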

Relevance: 30.00%

Abstract:

The new generation of multicore processors opens new perspectives for the design of embedded systems. Multiprocessing, however, poses new challenges to the scheduling of real-time applications, in which ever-increasing computational demands are constantly flanked by the need to meet critical time constraints. Many research works have contributed to this field by introducing new advanced scheduling algorithms. However, although many of these works have solidly demonstrated their effectiveness, the actual support for multiprocessor real-time scheduling offered by current operating systems is still very limited. This dissertation deals with the implementation aspects of real-time schedulers in modern embedded multiprocessor systems. The first contribution is an open-source scheduling framework, which is capable of realizing complex multiprocessor scheduling policies, such as G-EDF, on conventional operating systems by exploiting only their native scheduler from user space. A set of experimental evaluations compares the proposed solution to other research projects that pursue the same goals by means of kernel modifications, highlighting comparable scheduling performance. The principles that underpin the operation of the framework, originally designed for symmetric multiprocessors, have been further extended, first to asymmetric ones, which are subject to major restrictions such as the lack of support for task migrations, and later to re-programmable hardware architectures (FPGAs). In the latter case, this work introduces a scheduling accelerator, which offloads most of the scheduling operations to the hardware and exhibits extremely low scheduling jitter. The realization of a portable scheduling framework posed many interesting software challenges, one of which was timekeeping. In this regard, a further contribution is a novel data structure, called the addressable binary heap (ABH). The ABH, which is conceptually a pointer-based implementation of a binary heap, shows very interesting average and worst-case performance when addressing the problem of tick-less timekeeping with high-resolution timers.
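
A minimal sketch of the addressing idea behind such a structure (an array-backed Python illustration with an index map, not the thesis's pointer-based ABH): each insert returns a handle that later supports O(log n) cancellation, as needed when a high-resolution timer is cancelled before it fires.

    class AddressableHeap:
        """Min-heap whose insert returns a handle usable for O(log n) removal."""

        def __init__(self):
            self._a = []      # list of [key, handle] pairs
            self._pos = {}    # handle -> current index in self._a

        def insert(self, key):
            handle = object()  # opaque handle identifying this entry
            self._a.append([key, handle])
            self._pos[handle] = len(self._a) - 1
            self._sift_up(len(self._a) - 1)
            return handle

        def remove(self, handle):
            i = self._pos.pop(handle)
            last = self._a.pop()
            if i < len(self._a):          # removed entry was not the last
                self._a[i] = last
                self._pos[last[1]] = i
                self._sift_up(i)
                self._sift_down(i)

        def pop_min(self):
            key, handle = self._a[0]
            self.remove(handle)
            return key

        def _swap(self, i, j):
            self._a[i], self._a[j] = self._a[j], self._a[i]
            self._pos[self._a[i][1]] = i
            self._pos[self._a[j][1]] = j

        def _sift_up(self, i):
            while i and self._a[i][0] < self._a[(i - 1) // 2][0]:
                self._swap(i, (i - 1) // 2)
                i = (i - 1) // 2

        def _sift_down(self, i):
            while True:
                c = 2 * i + 1
                if c >= len(self._a):
                    return
                if c + 1 < len(self._a) and self._a[c + 1][0] < self._a[c][0]:
                    c += 1
                if self._a[i][0] <= self._a[c][0]:
                    return
                self._swap(i, c)
                i = c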

Relevance: 30.00%

Abstract:

Resource management is of paramount importance in network scenarios, and it is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has been maintained in almost the same shape for decades, a phenomenon known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, centralized control makes it possible to define more specific and complex tasks that involve many network functionalities, e.g., security, resource management and control, within a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees is driving network programmers to design protocols that deliver certain performance guarantees. This thesis exploits the use of SDN in conjunction with OpenFlow to manage differentiated network services with a high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between the architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network with differentiated services and stringent QoS requirements. We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
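
A minimal sketch of the kind of rule a QoS module could push through OpenFlow (assuming the Ryu controller framework and OpenFlow 1.3, not the thesis's architecture; the matched UDP port, queue ID and priority are hypothetical, and the switch queue is assumed to be pre-configured with a guaranteed rate):

    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3

    class QosSketch(app_manager.RyuApp):
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
        def on_switch_ready(self, ev):
            dp = ev.msg.datapath
            ofp, parser = dp.ofproto, dp.ofproto_parser
            # Match UDP traffic to a hypothetical real-time media port.
            match = parser.OFPMatch(eth_type=0x0800, ip_proto=17, udp_dst=5004)
            # Send matching packets through switch queue 1, then forward
            # along the normal L2 pipeline.
            actions = [parser.OFPActionSetQueue(1),
                       parser.OFPActionOutput(ofp.OFPP_NORMAL)]
            inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS,
                                                 actions)]
            dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=100,
                                          match=match, instructions=inst))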

Relevance: 30.00%

Abstract:

One of the problems most undervalued by smartphone users is the security of the data on their mobile devices. Today smartphones and tablets are used to send messages and photos and especially to stay connected with social networks, forums and other platforms. These devices contain a lot of private information, such as passwords, phone numbers, private photos and emails, and an attacker may choose to steal or destroy this information. The main topic of this thesis is the security of the applications present on the most popular stores (App Store for iOS and Play Store for Android) and of their mechanisms for the management of security. The analysis focuses on how the architecture of the two systems protects users from threats and highlights the actual presence of malware and spyware in the respective application stores. The work described in the subsequent chapters covers the study of the behavior of 50 Android applications and 50 iOS applications, performed using network analysis software. Furthermore, this thesis presents some statistics about the malware and spyware present on the respective stores and the permissions they require. In the end the reader will be able to understand how to recognize malicious applications and which of the two systems is more suitable for them. The thesis is structured as follows. The first chapter introduces the security mechanisms of the Android and iOS platform architectures and of their respective application stores. The second chapter explains the work done: what tools were needed to complete our analysis, why, and how we chose them. The third chapter discusses the execution of the tests, the protocol followed, and the approach used to assess the "level of danger" of each application checked. The fourth chapter explains the results of the tests and introduces some statistics on the presence of malicious applications on the Play Store and the App Store. The fifth chapter is devoted to the study of the users: what they think and how they might avoid malicious applications. The sixth chapter seeks to establish, following our methodology, which application store is safer. Finally, the seventh chapter concludes the thesis.
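
A minimal sketch of how a "level of danger" could be derived from traits observed in an application's network traffic (the thesis's actual protocol is described in its third chapter and is not reproduced here; the tracker list, weights and thresholds below are hypothetical):

    # Score one application from traits observed in its captured traffic.
    KNOWN_TRACKERS = {"tracker.example.com", "ads.example.net"}  # hypothetical

    def danger_level(endpoints, uses_plain_http, sends_device_id):
        score = 0
        score += 2 * len(endpoints & KNOWN_TRACKERS)  # third-party trackers
        score += 3 if uses_plain_http else 0          # unencrypted traffic
        score += 5 if sends_device_id else 0          # identifier leakage
        if score >= 8:
            return "high"
        return "medium" if score >= 3 else "low"

    print(danger_level({"api.example.org", "ads.example.net"}, True, False))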

Relevance: 30.00%

Abstract:

The goal of this thesis is to understand and assess whether the SDN paradigm, which is explained in Chapter 1, can be used effectively to implement systems for the protection and security of networks large and small. Besides introducing the SDN paradigm and its basic components, it introduces OpenFlow, the fundamental protocol for managing those components. To reach this goal, several preliminary steps were taken. First of all, we studied what SDN is. It introduces a potential innovation in the use of networks. The combination of a global view of the whole network with its programmability makes the management of network traffic a rather complicated process at the application level, but one with a highly flexible result. The changes that SDN introduces to the network architecture must be evaluated to guarantee that network security is preserved. Software Defined Networks (as we will see in the first chapters) can interact across all layers of the ISO/OSI model, and this characteristic can create problems. In today's networks, when one operates in a "confined" environment, it is easy both to predict what might happen and to trace the less easily detectable events. When several layers are managed at once, instead, the situation becomes much more complex: there are more factors to handle, the variability of possible cases grows sharply, and it also becomes harder to distinguish legitimate cases from illegitimate ones. Given these complications, we asked whether SDN has security problems of its own and how it could be used for security. To answer this question we reviewed the literature on the subject, presenting, in Chapter 3, some of the solutions that have been studied. We then describe the tools used to create and manage these networks (Chapter 4) and finally (Chapter 5) implement a case study to understand which problems must be faced in practice. All the steps identified are then described in detail, and conclusions are drawn at the end on the basis of the experience gained.
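
A minimal sketch of an emulated test topology attached to an external OpenFlow controller (assuming Mininet, a common tool for building such networks; the thesis's own Chapter 4 tooling is not reproduced here; requires root and a controller listening locally):

    from mininet.net import Mininet
    from mininet.node import RemoteController
    from mininet.topo import Topo

    class TwoHostTopo(Topo):
        def build(self):
            s1 = self.addSwitch("s1")
            self.addLink(self.addHost("h1"), s1)
            self.addLink(self.addHost("h2"), s1)

    net = Mininet(topo=TwoHostTopo(),
                  controller=lambda name: RemoteController(name,
                                                           ip="127.0.0.1"))
    net.start()
    net.pingAll()  # verify that the controller installs forwarding rules
    net.stop()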

Relevance: 30.00%

Abstract:

Security problems in software are on the rise, and the analysis tools adopted on GNU/Linux systems do not make it possible to identify the windows of vulnerability to which a package has been exposed. The goal of this thesis is to develop a computer forensics tool capable of reconstructing, by cross-referencing information obtained from the package manager with official security advisories, the security problems that may have caused a compromise of the system under examination.
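
A minimal sketch of the cross-referencing idea (the package, dates and advisory below are illustrative; a real tool would parse package-manager history and official advisory feeds):

    from datetime import date

    # (package, version) -> install date, from package-manager history
    installed = {("openssl", "1.0.1e"): date(2013, 2, 12)}
    # date the vulnerable version was replaced, if it ever was
    upgraded = {("openssl", "1.0.1e"): date(2014, 4, 9)}
    # advisories: package, affected version, disclosure/fix date, identifier
    advisories = [("openssl", "1.0.1e", date(2014, 4, 7), "CVE-2014-0160")]

    for pkg, ver, disclosed, cve in advisories:
        if (pkg, ver) in installed:
            start = installed[(pkg, ver)]
            end = upgraded.get((pkg, ver), date.today())
            print(f"{pkg} {ver}: window for {cve} from {start} to {end}")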

Relevance: 30.00%

Abstract:

BACKGROUND: Physiologic data display is essential to decision making in critical care. Current displays echo first-generation hemodynamic monitors dating to the 1970s and have not kept pace with new insights into physiology or the needs of clinicians who must make progressively more complex decisions about their patients. The effectiveness of any redesign must be tested before deployment. Tools that compare current displays with novel presentations of processed physiologic data are required. Regenerating conventional physiologic displays from archived physiologic data is an essential first step. OBJECTIVES: The purposes of the study were to (1) describe the SSSI (single sensor single indicator) paradigm that is currently used for physiologic signal displays, (2) identify and discuss possible extensions and enhancements of the SSSI paradigm, and (3) develop a general approach and a software prototype to construct such "extended SSSI displays" from raw data. RESULTS: We present the Multi Wave Animator (MWA) framework, a set of open-source MATLAB (MathWorks, Inc., Natick, MA, USA) scripts aimed at creating dynamic visualizations (e.g., video files in AVI format) of patient vital signs recorded from bedside (intensive care unit or operating room) monitors. Multi Wave Animator creates animations in which vital signs are displayed to mimic their appearance on current bedside monitors. The source code of MWA is freely available online together with a detailed tutorial and sample data sets.
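
A minimal sketch of the animation idea in Python with matplotlib (MWA itself is a set of MATLAB scripts; this analogue only illustrates replaying an archived waveform as a monitor-style sweep; the sampling rate and signal are synthetic, and saving to AVI requires ffmpeg):

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    fs = 125                              # sampling rate, Hz (synthetic)
    t = np.arange(0, 60, 1 / fs)          # one minute of archived signal
    signal = np.sin(2 * np.pi * 1.2 * t)  # stand-in for a recorded vital sign

    window = 5 * fs                       # 5-second sweep, like a monitor
    fig, ax = plt.subplots()
    line, = ax.plot(t[:window], signal[:window])
    ax.set_xlabel("time (s)")

    def update(frame):
        i = frame * (fs // 25)            # advance 1/25 s per video frame
        line.set_data(t[i:i + window], signal[i:i + window])
        ax.set_xlim(t[i], t[i + window - 1])
        return line,

    ani = FuncAnimation(fig, update, frames=25 * 55, interval=40)
    ani.save("vitals.avi", writer="ffmpeg")  # AVI output, as MWA produces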

Relevance: 30.00%

Abstract:

In this dissertation, the National Survey of Student Engagement (NSSE) serves as a nodal point through which to examine the power relations shaping the direction and practices of higher education in the twenty-first century. Theoretically, my analysis is informed by Foucault’s concept of governmentality, briefly defined as a technology of power that influences or shapes behavior from a distance. This form of governance operates through apparatuses of security, which include higher education. Foucault identified three essential characteristics of an apparatus—the market, the milieu, and the processes of normalization—through which administrative mechanisms and practices operate and govern populations. In this project, my primary focus is on the governance of faculty and administrators, as a population, at residential colleges and universities. I argue that the existing milieu of accountability is one dominated by the neoliberal assumption that all activity—including higher education—works best when governed by market forces alone, reducing higher education to a market-mediated private good. Under these conditions, what many in the academy believe is an essential purpose of higher education—to educate students broadly, to contribute knowledge for the public good, and to serve as society’s critic and social conscience (Washburn 227)—is being eroded. Although NSSE emerged as a form of resistance to commercial college rankings, it did not challenge the forces that empowered the rankings in the first place. Indeed, NSSE data are now being used to make institutions even more responsive to market forces. Furthermore, NSSE’s use has a normalizing effect that tends to homogenize classroom practices and erode the autonomy of faculty in the educational process. It also positions students as part of the system of surveillance. In the end, if aspects of higher education that are essential to maintaining a civil society are left to be defined solely in market terms, the result may be a less vibrant and, ultimately, a less just society.

Relevance: 30.00%

Abstract:

OBJECTIVE: To test the feasibility of and interactions among three software-driven critical care protocols. DESIGN: Prospective cohort study. SETTING: Intensive care units in six European and American university hospitals. PATIENTS: 174 cardiac surgery and 41 septic patients. INTERVENTIONS: Application of software-driven protocols for cardiovascular management, sedation, and weaning during the first 7 days of intensive care. MEASUREMENTS AND RESULTS: All protocols were used simultaneously in 85% of the cardiac surgery and 44% of the septic patients, and any one of the protocols was used for 73% and 44% of the study duration, respectively. Protocol use was discontinued in 12% of patients by the treating clinician and in 6% for technical/administrative reasons. The number of protocol steps per unit of time was similar in the two diagnostic groups (n.s. for all protocols). Initial hemodynamic stability (a protocol target) was achieved in 26+/-18 min (mean+/-SD) in cardiac surgery and in 24+/-18 min in septic patients. Sedation targets were reached in 2.4+/-0.2 h in cardiac surgery and in 3.6+/-0.2 h in septic patients. The weaning protocol was started in 164 (94%; 154 extubated) cardiac surgery and in 25 (60%; 9 extubated) septic patients. The median (interquartile range) time from the start of weaning to extubation (a protocol target) was 89 min (44-154 min) for the cardiac surgery patients and 96 min (56-205 min) for the septic patients. CONCLUSIONS: Multiple software-driven treatment protocols can be applied simultaneously with high acceptance and rapid achievement of primary treatment goals. The time to reach these primary goals may provide a performance indicator.

Relevance: 30.00%

Abstract:

To what extent is "software engineering" really "engineering" as the term is commonly understood? A hallmark of the products of the traditional engineering disciplines is trustworthiness based on dependability. But in his keynote presentation at ICSE 2006, Barry Boehm pointed out that individuals', systems', and peoples' dependency on software is becoming increasingly critical, yet dependability is generally not the top priority for software-intensive system producers. Continuing in an uncharacteristically pessimistic vein, Professor Boehm said that this situation will likely continue until a major software-induced system catastrophe, similar in impact to the 9/11 World Trade Center catastrophe, stimulates action toward establishing accountability for software dependability. He predicts that it is highly likely that such a software-induced catastrophe will occur between now and 2025. It is widely understood that software, i.e., computer programs, is intrinsically different from traditionally engineered products, but in one respect they are identical: the extent to which the well-being of individuals, organizations, and society in general increasingly depends on software. As wardens of the future, through our mentoring of the next generation of software developers, we believe that it is our responsibility to at least address Professor Boehm's predicted catastrophe. Traditional engineering has addressed, and continually addresses, its social responsibility through the evolution of the education, practice, and professional certification/licensing of professional engineers. To be included in the fraternity of professional engineers, software engineering must do the same. To get a rough idea of where software engineering currently stands on some of these issues, we conducted two surveys. Our main survey was sent to software engineering academics in the U.S., Canada, and Australia. Among other items, it sought detailed information on their software engineering programs. Our auxiliary survey was sent to U.S. engineering institutions to get some idea of how software engineering programs compare with those in the established engineering disciplines of Civil, Electrical, and Mechanical Engineering. Summaries of our findings can be found in the last two sections of our paper.

Relevance: 30.00%

Abstract:

This article reviews Article 6 of the Software Directive and discusses the need for a revision. Beyond clarification of the scope of the very limited provision on reverse engineering, it seems that the introduction of the clause into copyright was unfortunate. The indirect protection of ideas by prohibiting reverse engineering is foreign to the copyright concept. Permitting reverse engineering altogether would promote research and development and further other goals like ICT security. Innovation would not be retarded, which is why US trade secret law permits reverse engineering, based also on economic arguments. The notions of compatibility that Article 6 tries to address are better dealt with by competition law, as was demonstrated by the Microsoft decision of the European Court in 2007.

Relevance: 30.00%

Abstract:

In recent years, the development of information systems (IS) has rapidly changed towards an increasing division of labor between firms. Two trends are emerging. First, client companies increasingly outsource software development to external service providers. Second, the formerly oligopolistic enterprise application software industry has started to disintegrate into focal partnership networks – so-called platform ecosystems. Despite the increasing prominence of IS outsourcing and platform ecosystems, many of these inter-organizational partnerships fail to achieve the expected benefits. Ineffective governance and control frequently play a pivotal role in producing these failures. While designing effective governance and control mechanisms is always challenging, inter-organizational software development projects are often business-critical and exhibit additional dynamics and uncertainty. As a consequence, governance and control have to be adapted over time. The three research projects included in this book provide a better understanding of how and why governance and control can be effectively adapted over time. The implications for the successful management of inter-organizational software development projects are highly relevant for theory and practice.

Relevance: 30.00%

Abstract:

With the availability of lower cost but highly skilled software development labor from offshore regions, entrepreneurs from developed countries who do not have software development experience can utilize this workforce to develop innovative software products. In order to succeed in offshored innovation projects, the often extreme knowledge boundaries between the onsite entrepreneur and the offshore software development team have to be overcome. Prior research has proposed that boundary objects are critical for bridging such boundaries – if they are appropriately used. Our longitudinal, revelatory case study of a software innovation project is one of the first to explore the role of the software prototype as a digital boundary object. Our study empirically unpacks five use practices that transform the software prototype into a boundary object such that knowledge boundaries are bridged. Our findings provide new theoretical insights for literature on software innovation and boundary objects, and have implications for practice.

Relevance: 30.00%

Abstract:

Knowledge processes are critical to outsourced software projects. According to outsourcing research, outsourced software projects succeed if they manage to integrate the client's business knowledge and the vendor's technical knowledge. In this paper, we submit that this view may not be wrong, but is incomplete for a significant part of outsourced software work: software maintenance. Data from six software-maintenance outsourcing transitions indicate that application knowledge, which vendor engineers acquire over time through practice, can be more important than business or technical knowledge. Application knowledge was the dominant knowledge during knowledge transfer activities, and its acquisition enabled vendor staff to solve maintenance tasks. We discuss implications for widespread assumptions in outsourcing research.