955 results for Open Library Environment


Relevance:

30.00%

Publisher:

Abstract:

Software systems are progressively being deployed in many facets of human life, and the implications of their failure vary widely in their impact on users. The fundamental aspect that underpins a software system is a focus on quality. Reliability describes the ability of a system to function in a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier work focused on software reliability with no consideration of the hardware parts, or vice versa. However, a complete estimate of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying failure data for the hardware and software components and building a model on that data to predict reliability. To develop such a model, the focus is on systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on modeling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an integrated model for predicting the reliability of a computing system. The developed model has been compared with existing models and its usefulness is discussed.
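The abstract does not give the model's actual form. Purely as a minimal sketch of the combined idea — hardware and software reliabilities evaluated together — the following assumes constant (exponential) failure rates and a series configuration in which the system works only if every component works; the failure rates used are invented:

```python
import math

def component_reliability(failure_rate, t):
    """Reliability of one component under a constant failure rate
    (exponential model): R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * t)

def system_reliability(hw_rates, sw_rates, t):
    """Series combination: hardware and software component
    reliabilities multiply into one system-level figure."""
    r = 1.0
    for lam in list(hw_rates) + list(sw_rates):
        r *= component_reliability(lam, t)
    return r

# Two hardware parts and one software module, evaluated at t = 100 hours.
print(round(system_reliability([1e-4, 2e-4], [5e-4], 100.0), 4))  # 0.9231
```

A real model of the kind the thesis describes would fit the failure rates to observed failure data rather than assume them.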

Abstract:

The assessment of software maturity is an important area in the general software sector, and the field of OSS also applies various models to measure it. However, measuring the maturity of OSS used for applications in libraries is an area with no research so far; this study attempts to fill that gap. Measuring the maturity of software contributes knowledge about its sustainability over the long term, and maturity is one of the factors that positively influence adoption. The investigator measured the maturity of DSpace software using Woods and Guliani's Open Source Maturity Model (2005). The present study is significant as it addresses the maturity of OSS for libraries and fills the research gap in this area. In this sense the study opens new avenues for the field of library and information science by providing an additional tool for librarians in the selection and adoption of OSS. Measuring maturity brings in-depth knowledge of an OSS product, which contributes to the perceived usefulness and perceived ease of use explained in Technology Acceptance Model theory.
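The abstract does not reproduce the criteria or weights of the Woods and Guliani model. As an illustration only of how any weighted maturity model turns criterion ratings into a single comparable score (the criterion names, ratings and weights below are invented, not taken from the model):

```python
def maturity_score(scores, weights):
    """Weighted average of per-criterion ratings: each criterion is
    rated (here 0-10) and weighted by its importance."""
    assert set(scores) == set(weights)
    total_weight = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Hypothetical ratings for an OSS product on four illustrative criteria.
scores  = {"age": 8, "momentum": 7, "support": 9, "documentation": 6}
weights = {"age": 1, "momentum": 2, "support": 3, "documentation": 2}
print(maturity_score(scores, weights))  # 7.625
```

The point of such a score is comparability: two candidate packages rated against the same criteria and weights can be ranked for selection.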

Abstract:

Comparing the experiences of selected Latin American and Caribbean countries and their trajectories over the past 15 years offers rich insights into the dynamics and causes of the failure to meet the 2015 MDGs, as well as clues for post-MDG strategies. Central to achieving sustainable growth are government policies that support small and medium-sized farms and peasants, as these are crucial to several goals: achieving food security; providing a sound and stable rural environment able to resist external (financial) shocks; securing healthy, local food; and protecting vibrant and culturally rich local communities. This paper analyses and compares the most and least successful government policies carried out over the last 15 years in selected Latin American and Caribbean countries and, based on this analysis, offers strategies for more promising post-MDG policies able to reduce poverty and inequality, fight informality, and achieve more decent work in poor countries.

Abstract:

We are currently at the cusp of a revolution in quantum technology that relies not just on the passive use of quantum effects, but on their active control. At the forefront of this revolution is the implementation of a quantum computer. Encoding information in quantum states as "qubits" makes it possible to use entanglement and quantum superposition to perform calculations that are infeasible on classical computers. The fundamental challenge in the realization of quantum computers is to avoid decoherence – the loss of quantum properties – due to unwanted interaction with the environment. This thesis addresses the problem of implementing entangling two-qubit quantum gates that are robust with respect to both decoherence and classical noise. It covers three aspects: the use of efficient numerical tools for the simulation and optimal control of open and closed quantum systems, the role of advanced optimization functionals in facilitating robustness, and the application of these techniques to two of the leading implementations of quantum computation, trapped atoms and superconducting circuits. After a review of the theoretical and numerical foundations, the central part of the thesis starts with the idea of using ensemble optimization to achieve robustness with respect to both classical fluctuations in the system parameters and decoherence. For the example of a controlled phase gate implemented with trapped Rydberg atoms, this approach is demonstrated to yield a gate that is at least one order of magnitude more robust than the best known analytic scheme. Moreover, this robustness is maintained even for gate durations significantly shorter than those obtained in the analytic scheme. Superconducting circuits are a particularly promising architecture for the implementation of a quantum computer; their flexibility is demonstrated by performing optimizations for both diagonal and non-diagonal quantum gates.
To achieve robustness with respect to decoherence, it is essential to implement quantum gates in the shortest possible time. This can be facilitated by using an optimization functional that targets an arbitrary perfect entangler, based on a geometric theory of two-qubit gates. For the example of superconducting qubits, it is shown that this approach leads to significantly shorter gate durations, higher fidelities, and faster convergence than optimization towards specific two-qubit gates. Performing the optimization in Liouville space, in order to properly take decoherence into account, poses significant numerical challenges, as the dimension scales quadratically compared to Hilbert space. However, it can be shown that for a unitary target the optimization requires propagation of at most three states, instead of a full basis of Liouville space. For both trapped Rydberg atoms and superconducting qubits, the successful optimization of quantum gates is demonstrated at a numerical cost significantly lower than was previously thought possible. Together, the results of this thesis point towards a comprehensive framework for the optimization of robust quantum gates, paving the way for the future realization of quantum computers.
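Ensemble optimization averages a gate functional over a set of fluctuating system parameters and maximizes that average. A toy sketch of the averaged quantity only — here a single-qubit phase gate under pulse-amplitude errors, far simpler than the two-qubit, decoherence-aware optimizations in the thesis; the ±2% error model is invented for illustration:

```python
import cmath
import math

def phase_gate_fidelity(phi_actual, phi_target):
    """Gate fidelity F = |tr(U_t^dag U)|^2 / d^2 (d = 2) for
    single-qubit phase gates U(phi) = diag(1, e^{i*phi})."""
    tr = 1 + cmath.exp(1j * (phi_actual - phi_target))
    return abs(tr) ** 2 / 4

def ensemble_fidelity(phi_target, relative_errors):
    """Fidelity averaged over an ensemble of amplitude errors --
    the kind of figure of merit an ensemble optimization maximizes."""
    fids = [phase_gate_fidelity(phi_target * (1 + e), phi_target)
            for e in relative_errors]
    return sum(fids) / len(fids)

# A pi phase gate under +/-2% amplitude fluctuations.
print(round(ensemble_fidelity(math.pi, [-0.02, 0.0, 0.02]), 6))  # 0.999342
```

An optimizer would adjust the control parameters to push this ensemble average, rather than the fidelity of any single ensemble member, towards one.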

Abstract:

Since no physical system can ever be completely isolated from its environment, the study of open quantum systems is pivotal to reliably and accurately controlling complex quantum systems. In practice, reliability of the control field needs to be confirmed via certification of the target evolution, while accuracy requires the derivation of high-fidelity control schemes in the presence of decoherence. In the first part of this thesis an algebraic framework is presented that makes it possible to determine the minimal requirements for the unique characterisation of arbitrary unitary gates in open quantum systems, independent of the particular physical implementation of the quantum device employed. To this end, a set of theorems is devised that can be used to assess whether a given set of input states to a quantum channel is sufficient to judge whether a desired unitary gate is realised. This allows the minimal input for such a task to be determined, which proves, quite remarkably, to be independent of system size. These results elucidate the fundamental limits of certification and tomography of open quantum systems. Combining these insights with state-of-the-art Monte Carlo process certification techniques permits a significant improvement in scaling when certifying arbitrary unitary gates. This improvement is not restricted to quantum information devices whose basic information carrier is the qubit; it also extends to systems whose fundamental informational entities can be of arbitrary dimensionality, the so-called qudits. The second part of this thesis concerns the impact of these findings from the point of view of Optimal Control Theory (OCT). OCT for quantum systems utilises concepts from engineering, such as feedback and optimisation, to engineer constructive and destructive interference in order to steer a physical process in a desired direction.
It turns out that the aforementioned mathematical findings allow novel optimisation functionals to be deduced that significantly reduce not only the memory required by numerical control algorithms but also the total CPU time needed to reach a given fidelity for the optimised process. The thesis concludes by discussing two problems of fundamental interest in quantum information processing from the point of view of optimal control: the preparation of pure states and the implementation of unitary gates in open quantum systems. For both cases specific physical examples are considered: for the former, the vibrational cooling of molecules via optical pumping; for the latter, a superconducting phase qudit implementation. In particular, it is illustrated how features of the environment can be exploited to reach the desired targets.
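A toy illustration of the basic idea of judging a gate from a fixed set of input states: apply a channel to a few probe states and compare against the target unitary's action. The probe set below is ad hoc (not the minimal set derived in the thesis) and the "channel" is noiseless, so this sketches only the comparison step, not the open-system theory:

```python
import math

def apply(U, psi):
    """Apply a 2x2 matrix U to a single-qubit state vector."""
    return [sum(U[i][j] * psi[j] for j in range(2)) for i in range(2)]

def overlap(a, b):
    """Squared overlap |<a|b>|^2; insensitive to global phase."""
    return abs(sum(x.conjugate() * y for x, y in zip(a, b))) ** 2

def certifies(channel, U, probes, tol=1e-9):
    """Accept the channel as implementing U if its output matches
    U|psi> (up to tol) on every probe state."""
    return all(overlap(channel(p), apply(U, p)) > 1 - tol for p in probes)

Z = [[1, 0], [0, -1]]
X = [[0, 1], [1, 0]]
s = 1 / math.sqrt(2)
probes = [[1, 0], [0, 1], [s, s]]  # |0>, |1>, |+>

print(certifies(lambda p: apply(Z, p), Z, probes))  # True
print(certifies(lambda p: apply(X, p), Z, probes))  # False
```

The thesis's result that a size-independent minimal probe set suffices is what makes a procedure of this shape scale to large systems.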

Abstract:

Compositional data naturally arise from the scientific analysis of the chemical composition of archaeological material such as ceramic and glass artefacts. Data of this type can be explored using a variety of techniques, from standard multivariate methods such as principal components analysis and cluster analysis to methods based on the use of log-ratios. The general aim is to identify groups of chemically similar artefacts that could potentially be used to answer questions of provenance. This paper demonstrates work in progress on the development of a documented library of methods, implemented using the statistical package R, for the analysis of compositional data. R is an open source package that makes very powerful statistical facilities available at no cost. We aim to show how, with the aid of statistical software such as R, traditional exploratory multivariate analysis can easily be used alongside, or in combination with, specialist techniques of compositional data analysis. The library has been developed from a core of basic R functionality, together with purpose-written routines arising from our own research (for example that reported at CoDaWork'03). In addition, we have included other appropriate publicly available techniques and libraries that have been implemented in R by other authors. Available functions range from standard multivariate techniques through to various approaches to log-ratio analysis and zero replacement. We also discuss and demonstrate a small selection of relatively new techniques that have hitherto been little used in archaeometric applications involving compositional data. The application of the library to the analysis of data arising in archaeometry will be demonstrated; results from different analyses will be compared; and the utility of the various methods discussed.
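The library itself is in R; as a language-neutral sketch of the log-ratio idea at its core, the centred log-ratio (clr) transform divides each part of a composition by the geometric mean of all parts before taking logs, moving the data off the constrained simplex (the sample values below are invented):

```python
import math

def clr(composition):
    """Centred log-ratio transform: log of each part relative to the
    geometric mean of all parts of the composition."""
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]

sample = [0.7, 0.2, 0.1]  # e.g. three oxide proportions in a ceramic sherd
coords = clr(sample)
print(abs(sum(coords)) < 1e-12)  # clr coordinates sum to zero: True
```

Standard multivariate methods such as PCA or cluster analysis can then be applied to the clr coordinates rather than to the raw proportions.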

Abstract:

The current business environment has led companies to adopt new strategies to enter new markets while remaining competitive and productive. One such strategy is the creation of business networks, which allows companies to lean on others with which they share a common objective. Associativity has established itself as the simplest and most practical way to reach new markets without incurring great expense or risk; this does not mean that the risks and expenses disappear, but they do decrease significantly. Companies also seek to be healthy organizations, through the implementation of programs and the adoption of new criteria that allow them to be more competitive while contributing to their employees, their suppliers, and the community. They seek to be organizations that highlight the positive aspects of their operations and improve relationships at all levels.

Abstract:

The end of the Cold War meant not only the triumph of capitalism and liberal democracy, but also a significant change in the international system, which became less centralized and more regionalized as a consequence of the proximity and interdependence of its actors (not only states), allowing the formation of Regional Security Complexes (RSCs). RSCs are an effective way of relating to and approaching the international arena, since through their processes of securitization and desecuritization they achieve specific objectives. On this basis, both the European Union (EU) and the Southern African Development Community (SADC) initiated several securitization processes related to regional integration, one example being the elimination of controls at their internal borders, i.e. the free movement of persons. They considered that failing to realize this would generate political threats (their influence and capacity to act would be threatened), economic threats (to their competitiveness and basic levels of welfare) and societal threats (to the identity of the community, indispensable for integration) that would put the very existence of their RSCs at risk. Accordingly, the EU created the Schengen Area, the product of a securitization process running from the early 1980s to the mid-1990s, while the SADC has been immersed in such a securitization process from 1992 to the present and awaits the ratification of the Protocol on the Facilitation of Movement of Persons as a first step towards eliminating controls at its internal borders. Although both the EU and the SADC considered that failing to allow the free movement of persons would put their integration, and therefore their RSCs, at risk, the SADC has not achieved it.
This makes a deeper analysis of their securitization processes essential in order to identify the SADC's shortcomings relative to the EU's success. The analysis is based on Barry Buzan's theory of security complexes, set out in Security: A New Framework for Analysis (1998) by Barry Buzan, Ole Wæver and Jaap de Wilde, and is divided into the stages of the securitization process: the identification of an existential threat to a referent object through a speech act, the acceptance of the threat by a relevant audience, and the emergency actions taken to confront existential threats; recognizing the differences and similarities between a successful securitization process and one that has not yet succeeded.

Abstract:

This case study aims to explain the role of international development cooperation in Tanzania, Mozambique and Nigeria in consolidating Japan's international political leadership. The interest of this research lies in broadening knowledge about the use of soft power to achieve Japanese foreign policy objectives. To that end, a bibliographic review is carried out, analysing official documents and academic articles to consolidate the information. On this basis, the study seeks to demonstrate that development cooperation is a Japanese foreign policy tool for consolidating leadership, insofar as the use of cooperation instruments, and the budget allocated to their execution, influence the votes of these African states on Japanese initiatives at the United Nations.

Abstract:

Cybersecurity is a complex challenge that has emerged alongside the evolving global socio-technical environment of social networks, which feature connectivity across time and space in ways unimaginable even a decade ago. This paper reports on the preliminary findings of a NATO-funded project that investigates the nature of innovation in open collaborative communities and its implications for cyber security. The authors describe the framing of the relevant issues, the articulation of the research questions, and the derivation of a conceptual framework based on open collaborative innovation that has emerged from preliminary field research in Russia and the UK.

Abstract:

Reducing the carbon conversion of ruminally degraded feed into methane increases feed efficiency and reduces emission of this potent greenhouse gas into the environment. Accurate yet simple predictions of the methane production of ruminants on any feeding regime are important in ruminant nutrition and in modeling the methane they produce. The current work investigated feed intake, digestibility and methane production, by open-circuit respiration measurements, in sheep fed 15 untreated, sodium hydroxide (NaOH)-treated and anhydrous ammonia (NH3)-treated wheat, barley and oat straws. In vitro fermentation characteristics of the straws were obtained from incubations using the Hohenheim gas production system, which measured gas production, true substrate degradability, short-chain fatty acid (SCFA) production, and efficiency of microbial production from the ratio of truly degraded substrate to gas volume. Across the 15 straws, organic matter (OM) intake and in vivo OM digestibility ranged from 563 to 1201 g and from 0.464 to 0.643, respectively. Total daily methane production ranged from 13.0 to 34.4 l, whereas methane produced per kg OM apparently digested in vivo varied from 35.0 to 61.8 l. OM intake was positively related to total methane production (R2 = 0.81, P<0.0001), and in vivo OM digestibility was also positively associated with methane production (R2 = 0.67, P<0.001) but negatively associated with methane production per kg digestible OM intake (R2 = 0.61, P<0.001). In the in vitro incubations of the 15 straws, the ratio of acetate to propionate ranged from 2.3 to 2.8 (P<0.05) and efficiencies of microbial production ranged from 0.21 to 0.37 (P<0.05) at half asymptotic gas production.
Total daily methane production, calculated from the in vitro fermentation characteristics (i.e., true degradability, SCFA ratio and efficiency of microbial production) and OM intake, compared well with methane measured in the open-circuit respiration chamber (y = 2.5 + 0.86x, R2 = 0.89, P<0.0001, Sy.x = 2.3). Methane production from forage-fed ruminants can thus be predicted accurately by simple in vitro incubations combining true substrate degradability and gas volume measurements, provided feed intake is known.
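A minimal sketch of applying the reported calibration: the regression coefficients (y = 2.5 + 0.86x) come from the abstract, while the function name and the example input value are illustrative only:

```python
def predicted_methane(in_vitro_estimate_l_per_day):
    """Chamber-measured methane (l/day) predicted from the in vitro
    estimate via the reported regression y = 2.5 + 0.86x (R2 = 0.89)."""
    return 2.5 + 0.86 * in_vitro_estimate_l_per_day

# An in vitro estimate of 25 l/day maps to a prediction of 24.0 l/day.
print(predicted_methane(25.0))  # 24.0
```

In practice the in vitro estimate would itself be computed from measured true degradability, SCFA ratio, efficiency of microbial production and OM intake, as described above.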