980 results for software-defined storage
Abstract:
This work develops an application based on Android technology for client service in law firms.
Abstract:
Based on the requirements defined by the Universitat de Lleida, this work proposes the implementation of a support tool based on a free software product called GLPI. The implementation process consists of parameterizing GLPI to fit the required model, seeking to avoid any need to modify the source code of the chosen system.
Abstract:
The determination of sediment storage is a critical parameter in sediment budget analyses. However, in many sediment budget studies the quantification of the magnitude and time-scale of sediment storage is still the weakest part and often relies on crude estimations only, especially in large drainage basins (>100 km²). We present a new approach to storage quantification in a meso-scale alpine catchment of the Swiss Alps (Turtmann Valley, 110 km²). The quantification of depositional volumes was performed by combining geophysical surveys and geographic information system (GIS) modelling techniques. Mean thickness values of each landform type calculated from these data were used to estimate the sediment volume in the hanging valleys and on the trough slopes. The sediment volume of the remaining subsystems was determined by modelling an assumed parabolic bedrock surface using digital elevation model (DEM) data. A total sediment volume of 781.3×10⁶ to 1005.7×10⁶ m³ is deposited in the Turtmann Valley. Over 60% of this volume is stored in the 13 hanging valleys. Moraine landforms contain over 60% of the deposits in the hanging valleys, followed by sediment stored on slopes (20%) and rock glaciers (15%). For the first time, a detailed quantification of different storage types was achieved in a catchment of this size. Sediment volumes have been used to calculate mean denudation rates for the different processes, ranging from 0.1 to 2.6 mm/a based on a time span of 10 ka. As the quantification approach includes a number of assumptions and various sources of error, the values given represent the order of magnitude of sediment storage to be expected in a catchment of this size.
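The volume-to-rate conversion behind these figures can be illustrated with a short, hedged sketch. This is a minimal back-of-the-envelope model assuming rate = stored volume / (catchment area × time span); the function name and the simple conversion are illustrative, not the authors' exact workflow.

```python
# Hedged sketch: mean denudation rate implied by a stored sediment
# volume, assuming rate = volume / (area * time). Illustrative only.

def mean_denudation_rate_mm_per_a(volume_m3: float,
                                  area_km2: float,
                                  timespan_ka: float) -> float:
    """Catchment-wide mean denudation rate in mm/a."""
    area_m2 = area_km2 * 1e6              # km^2 -> m^2
    years = timespan_ka * 1e3             # ka  -> a
    eroded_layer_m = volume_m3 / area_m2  # equivalent eroded thickness
    return eroded_layer_m / years * 1e3   # m/a -> mm/a

# Reported volume bounds for the Turtmann Valley (110 km^2, 10 ka):
for v_m3 in (781.3e6, 1005.7e6):
    rate = mean_denudation_rate_mm_per_a(v_m3, 110, 10)
    print(f"{v_m3:.4g} m^3 -> {rate:.2f} mm/a")
```

Under these assumptions the catchment-wide averages come out at roughly 0.7 to 0.9 mm/a, inside the per-process range of 0.1 to 2.6 mm/a quoted above.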
Abstract:
Background: The DEFUSE (n=74) and EPITHET (n=101) studies have in common that a baseline MRI was obtained prior to treatment (tPA in DEFUSE; tPA or placebo in EPITHET) in the 3-6 hour time window. There were, however, important methodological differences between the studies. A standardized reanalysis of pooled data was undertaken to determine the effect of these differences on baseline characteristics and study outcomes. Methods: To standardize the studies, 1) the DWI and PWI source images were reprocessed and segmented using automated image processing software (RAPID); 2) patients were categorized according to their baseline MRI profile as either Target Mismatch (PWI Tmax>6 s/DWI ratio ≥1.8 and an absolute mismatch ≥15 mL), Malignant (DWI or PWI Tmax>10 s lesion ≥100 mL), or No Mismatch; 3) favorable clinical response was defined as an NIHSS score of 0-1 or a ≥8-point improvement on the NIHSS at day 90. Results: Prior to standardization there was no difference in the proportion of Target Mismatch patients between EPITHET and DEFUSE (54% vs 49%, p=0.6), but the EPITHET study had more patients with the Malignant profile than DEFUSE (35% vs 9%, p<0.01) and fewer patients with No Mismatch (11% vs 42%, p<0.01). These differences in baseline MRI profiles between EPITHET and DEFUSE were largely eliminated by standardized processing of PWI and DWI images with RAPID software (Target Mismatch 49% vs 48%; Malignant 15% vs 8%; No Mismatch 36% vs 25%; p=NS for all comparisons). Reperfusion was strongly associated with a favorable clinical response in mismatch patients (figure). This relationship was not affected by the standardization procedures (pooled odds ratio of 8.8 based on original data and 6.6 based on standardized data). Conclusion: Standardization of image analysis procedures in acute stroke is important, as non-standardized techniques introduce significant variability in DWI and PWI imaging characteristics. Despite methodological differences, the DEFUSE and EPITHET studies show a consistent and robust association between reperfusion and favorable clinical response in Target Mismatch patients regardless of standardization. These data support an RCT of IV tPA in the 3-6 hour time window for Target Mismatch patients identified using RAPID.
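The profile definitions above amount to a simple decision rule over lesion volumes. The following is a minimal sketch of that rule, assuming lesion volumes in mL as inputs; the function name and inputs are hypothetical, and this restates the published thresholds rather than the RAPID software's actual implementation.

```python
# Hedged sketch of the baseline MRI profile rule stated above.
# Inputs are lesion volumes in mL; illustrative only, not RAPID.

def classify_profile(dwi_ml: float, pwi_tmax6_ml: float,
                     pwi_tmax10_ml: float) -> str:
    # Malignant: very large DWI lesion or severe (Tmax>10 s) PWI lesion.
    if dwi_ml >= 100 or pwi_tmax10_ml >= 100:
        return "Malignant"
    # Target Mismatch: PWI(Tmax>6 s)/DWI ratio >= 1.8 and an absolute
    # mismatch >= 15 mL (ratio treated as satisfied when DWI is 0).
    ratio_ok = dwi_ml == 0 or pwi_tmax6_ml / dwi_ml >= 1.8
    if ratio_ok and (pwi_tmax6_ml - dwi_ml) >= 15:
        return "Target Mismatch"
    return "No Mismatch"

print(classify_profile(dwi_ml=20, pwi_tmax6_ml=60, pwi_tmax10_ml=10))
# -> Target Mismatch (ratio 3.0, absolute mismatch 40 mL)
```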
Abstract:
The objective of this work was to determine the total protein profile and the contents of the four major protein fractions (albumin, globulin, prolamin and glutelin) and of the amino acids in the endosperm of the wild rice species Oryza glumaepatula. The experiment was performed with 29 accessions of this species, collected from 13 Brazilian locations, and two commercial cultivars. Protein samples were prepared using dried, polished, and ground grains to obtain the homogeneous, dry flour used in the preparation of extracts. Oryza glumaepatula accessions were identified with the highest levels of total protein, of the albumin and glutelin protein fractions, and of amino acids (with the exception of tryptophan) in comparison to the two analyzed rice cultivars. The albumin and glutelin profiles in SDS-PAGE were distinct between the rice cultivars and O. glumaepatula. This wild species has the potential to increase the nutritional quality of rice storage protein through interspecific crosses.
Abstract:
Closing talk of Open Access Week 2011 at the UOC, by the lawyer Josep Jover. Why do altruistic strategies beat selfish ones in both free software and the #15m movement? The #15m movement, like software but unlike tangible goods, cannot be owned: an indeterminate number of people can enjoy it (by joining it) without depriving anyone else of the chance to do the same. And that turns everything on its head: how universities manage information and what their mission is in this new society. In the immediate future, universities will be valued not for the information they hold, which will always be richer and more extensive beyond their walls, but for their capacity to create critical masses, whether in the pursuit of knowledge, in skill-building, or in networks of peers. Universities will have to implement this model or risk being left behind.
Abstract:
This study proposes a new quantitative approach to assessing the quality of open access university institutional repositories, and tests the results of this approach on Spanish university repositories. The assessment method is based on a binary coding of a proposed set of features that objectively describe the repositories. The purposes of the method are to assess quality and to allow a nearly automatic update of the feature data. First, a database was created covering the 38 Spanish institutional repositories. The variables of analysis are presented and explained, whether drawn from the literature or newly defined. Among the characteristics analyzed are the features of the software, the services of the repository, the features of the information system, Internet visibility, and the licenses of use. Results from the Spanish universities are provided as a practical example of the assessment and to give a picture of the state of development of the open access movement in Spain.
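The binary coding lends itself to a very small scoring sketch. This is a minimal illustration assuming each feature is coded 1 (present) or 0 (absent) and the quality score is the fraction of features present; the feature names below are illustrative placeholders, not the study's actual indicator set.

```python
# Hedged sketch of a binary-coded repository assessment: each feature
# is 0/1 and the score is the fraction of features present.
# Feature names are illustrative placeholders.

FEATURES = [
    "oai_pmh_interface",          # information-system features
    "full_text_search",           # software features
    "usage_statistics",           # repository services
    "persistent_identifiers",
    "indexed_in_google_scholar",  # Internet visibility
    "licenses_of_use_displayed",  # licenses of use
]

def quality_score(repo: dict) -> float:
    """Fraction of checklist features the repository satisfies."""
    return sum(repo.get(f, 0) for f in FEATURES) / len(FEATURES)

example = {
    "oai_pmh_interface": 1, "full_text_search": 1,
    "usage_statistics": 0, "persistent_identifiers": 1,
    "indexed_in_google_scholar": 1, "licenses_of_use_displayed": 0,
}
print(f"score = {quality_score(example):.2f}")  # -> score = 0.67
```

Because every indicator is a simple 0/1 observation, re-running such a checklist against each repository is what makes the near-automatic updating mentioned above feasible.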
Abstract:
BACKGROUND: Current bilevel positive-pressure ventilators for home noninvasive ventilation (NIV) provide physicians with software that records items important for patient monitoring, such as compliance, tidal volume (Vt), and leaks. However, to our knowledge, the validity of this information has not yet been independently assessed. METHODS: Testing was done for seven home ventilators on a bench model adapted to simulate NIV and generate unintentional leaks (ie, other than those of the mask exhalation valve). Five levels of leaks were simulated using a computer-driven solenoid valve (0-60 L/min) at different levels of inspiratory pressure (15 and 25 cm H₂O) and at a fixed expiratory pressure (5 cm H₂O), for a total of 10 conditions. Bench data were compared with results retrieved from ventilator software for leaks and Vt. RESULTS: For assessing leaks, three of the devices tested were highly reliable, with a small bias (0.3-0.9 L/min), narrow limits of agreement (LA), and high correlations (R², 0.993-0.997) when comparing ventilator software and bench results; conversely, for four ventilators, bias ranged from -6.0 L/min to -25.9 L/min, exceeding -10 L/min for two devices, with wide LA and lower correlations (R², 0.70-0.98). Bias for leaks increased markedly with the magnitude of leaks in three devices. Vt was underestimated by all devices, and bias (range, 66-236 mL) increased with higher insufflation pressures. Only two devices had a bias < 100 mL across all testing conditions. CONCLUSIONS: Physicians monitoring patients who use home ventilation must be aware of differences in the estimation of leaks and Vt by ventilator software. Also, leaks are reported in different ways according to the device used.
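The bias and limits of agreement reported above are standard Bland-Altman quantities. Here is a minimal sketch of that computation on paired software-versus-bench readings, assuming bias = mean difference and 95% limits of agreement = bias ± 1.96 SD of the differences; the paired values below are made up for illustration, not the study's data.

```python
# Hedged sketch: bias and 95% limits of agreement (Bland-Altman) for
# ventilator-software leak readings against bench reference values.
# The paired measurements below are hypothetical.
import statistics

def bias_and_loa(software, reference):
    diffs = [s - r for s, r in zip(software, reference)]
    bias = statistics.mean(diffs)            # mean difference
    sd = statistics.stdev(diffs)             # SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

soft = [0.5, 14.8, 29.6, 44.1, 58.7]   # software-reported leak, L/min
ref  = [0.0, 15.0, 30.0, 45.0, 60.0]   # bench reference leak, L/min
bias, (lo, hi) = bias_and_loa(soft, ref)
print(f"bias = {bias:+.2f} L/min, LA = [{lo:.2f}, {hi:.2f}] L/min")
```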
Abstract:
Open source is typically outside of normal commercial software procurement processes. The challenges: an increasingly diverse and distributed set of development resources, and little or no visibility into the origins of the software. Supply chain comparison: hardware vs software. Open source has revolutionized the mobile and device landscape; other industries will follow. Supply chain management techniques from hardware are useful for managing software. SPDX is a standard format for communicating a software Bill of Materials across the supply chain. Effective management and control requires training, tools, processes and standards.
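To make the SPDX mention concrete, here is a hedged sketch that prints a minimal SPDX 2.x tag-value Bill of Materials for a single package; all field values are placeholders, not a real package.

```python
# Hedged sketch: emit a minimal SPDX tag-value document for one package,
# to show the kind of Bill of Materials the format carries across a
# supply chain. All values below are placeholders.

fields = [
    ("SPDXVersion", "SPDX-2.3"),
    ("DataLicense", "CC0-1.0"),
    ("SPDXID", "SPDXRef-DOCUMENT"),
    ("DocumentName", "example-bom"),
    ("PackageName", "example-lib"),   # hypothetical package
    ("PackageVersion", "1.4.2"),
    ("PackageDownloadLocation", "https://example.org/example-lib.tar.gz"),
    ("PackageLicenseDeclared", "Apache-2.0"),
]
print("\n".join(f"{tag}: {value}" for tag, value in fields))
```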
Abstract:
The use of open source software continues to grow on a daily basis. Today, enterprise applications contain 40% to 70% open source code, and this fact has legal, development, IT security, risk management and compliance organizations focusing their attention on its use as never before. They increasingly understand that the open source content within an application must be detected. Once uncovered, decisions regarding compliance with intellectual property licensing obligations must be made, and known security vulnerabilities must be remediated. From a risk perspective, it is no longer sufficient to leave either of these open source issues unaddressed.
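The detect-then-triage workflow described here can be sketched in a few lines. This is a minimal illustration assuming a pre-built inventory of detected components, a known-vulnerability lookup, and a license allow-list; all component, CVE, and policy data below are hypothetical.

```python
# Hedged sketch of open source triage: match detected components
# against a known-vulnerability list and a license policy.
# All component, CVE, and policy data here are hypothetical.

components = [
    {"name": "libfoo", "version": "2.1", "license": "GPL-3.0-only"},
    {"name": "barjs",  "version": "0.9", "license": "MIT"},
]
known_vulns = {("libfoo", "2.1"): ["CVE-2099-0001"]}   # placeholder ID
allowed_licenses = {"MIT", "Apache-2.0", "BSD-3-Clause"}

for c in components:
    for cve in known_vulns.get((c["name"], c["version"]), []):
        print(f"{c['name']} {c['version']}: remediate {cve}")
    if c["license"] not in allowed_licenses:
        print(f"{c['name']}: review licensing obligations ({c['license']})")
```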
Abstract:
Industry and large agencies need "agile" programming resources, to reinforce their own development staff and to take advantage of innovative approaches produced by "fresh minds" all over the world. At the same time, they may be reluctant to engage in classical software development calls for tenders and contracts. Such contracts are often entrusted to large ICT firms, which will deliver according to their own rigid frameworks (often based on alliances with proprietary software vendors), may propose comfortable quality assurances, but will cover their (real) risks and liability with high contingency costs and will charge for any change request where the original specifications did not anticipate every possible issue. Introducing FLOSS in business implies a new contracting philosophy, based on incentives rather than penalties and liability. Based on 2011 experience with a large space agency, Patrice-Emmanuel Schmitz pictures the legal instruments needed for this novel approach.
Abstract:
Free and Open Source Software (FOSS) seems far removed from the military field, but in some cases technologies normally used for civilian purposes may have military applications. These products and technologies are called dual-use. Can we manage to combine FOSS and dual-use products? On one hand, we have to admit that this kind of association exists: dual-use software can be FOSS, and many examples demonstrate this duality. On the other hand, dual-use software available under free licenses leads us to ask many questions. For example, dual-use export control laws aim at stemming the proliferation of weapons of mass destruction. Dual-use export control in the United States (ITAR) and in Europe (Regulation 428/2009) implies, as a consequence, the prohibition or regulation of software exportation, involving the closing of source code. Therefore, the issue of exported software released under free licenses arises. If software is a dual-use good and serves military purposes, it may represent a danger. Through the rights granted by free licenses to run, study, redistribute and distribute modified versions of the software, anyone can access free dual-use software. So the licenses themselves are not the origin of the risk; it is actually linked to the facilitated access to source code. Seen from this point of view, this goes against the dual-use regulations that allow states to control the exportation of these technologies. For this analysis, we will discuss various legal questions and draft answers drawn from either licenses or public policies in this respect.