14 results for Gzip
Abstract:
This report describes recent updates to the custom-built data-acquisition hardware operated by the Center for Hypersonics. In 2006, an ISA-to-USB bridging card was developed as part of Luke Hillyard's final-year thesis. This card allows the hardware to be connected to any recent personal computer via a serial port (USB or RS232), and it provides a number of simple text-based commands for control of the hardware. A graphical user interface program was also updated to help the experimenter manage the data-acquisition functions. Sampled data is stored in text files that have been compressed in the gzip format. To simplify the later archiving or transport of the data, all files specific to a shot are stored in a single directory. This includes a text file for the run description, the signal configuration file and the individual sampled-data files, one for each signal that was recorded.
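The report's own software is not reproduced here; as a minimal sketch of the storage scheme described above (one directory per shot, a run-description text file, and one gzip-compressed text file per recorded signal), in Python with hypothetical file and signal names:

```python
import gzip
import os

def store_shot(shot_dir, run_description, signals):
    """Store one shot's files in a single directory: a plain-text run
    description and one gzip-compressed text file per recorded signal."""
    os.makedirs(shot_dir, exist_ok=True)
    with open(os.path.join(shot_dir, "run_description.txt"), "w") as f:
        f.write(run_description)
    for name, samples in signals.items():
        # One gzip-compressed text file per signal, one sample per line.
        path = os.path.join(shot_dir, f"{name}.txt.gz")
        with gzip.open(path, "wt") as f:
            f.writelines(f"{s}\n" for s in samples)

# Hypothetical usage: two recorded signals for one shot.
store_shot("shot7319", "example run description",
           {"pitot1": [0.01, 0.02], "heatflux3": [5.1, 5.3]})
```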
Abstract:
This project develops a FreeRTOS application for the LPC1769 mote, connected to a WiFly module, that fetches a file from the Internet, compresses it, and stores it back on the Internet, appending basic statistics on compression time and compression percentage.
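A simplified desktop analogue of the compress-and-measure step, in Python rather than the FreeRTOS/LPC1769 code itself (the URL is a placeholder):

```python
import gzip
import time
import urllib.request

# Placeholder source URL; the real application fetches via the WiFly module.
url = "http://example.com/input.txt"
raw = urllib.request.urlopen(url).read()

start = time.perf_counter()
compressed = gzip.compress(raw)
elapsed = time.perf_counter() - start

# Basic statistics of the kind appended to the uploaded result.
ratio = 100.0 * (1 - len(compressed) / len(raw))
print(f"compressed in {elapsed:.3f} s, saved {ratio:.1f}% of {len(raw)} bytes")
```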
Abstract:
Data traffic between mobile advertising client software and the network server can be a pain point for application developers considering advertising-funded application distribution, since the cost of the data transfer might scare users away from their applications. For the thesis project, a simulation environment was built to mimic the real client-server solution and to measure the data transfer over varying connection types and usage scenarios. To optimise data transfer, a few general-purpose compressors and XML-specific compressors were tried for compressing the XML data, and a few protocol optimisations were implemented. To optimise cost, cache usage was improved and pre-loading was enhanced to use free connections for loading the data. The data traffic structure and the various optimisations were analysed, and it was found that cache usage and pre-loading should be enhanced and that the protocol should be changed, with report aggregation and compression using WBXML or gzip.
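As a rough illustration of the saving gzip can give on such XML payloads, a small sketch with an invented report message (real ad-client messages would differ):

```python
import gzip

# Invented example of a small XML report payload with repetitive structure.
xml = ("<reports>" + "".join(
    f"<report ad='banner{i}' event='click' ts='1700000000'/>"
    for i in range(50)
) + "</reports>").encode("utf-8")

packed = gzip.compress(xml)
print(f"plain: {len(xml)} B, gzip: {len(packed)} B "
      f"({100 * len(packed) / len(xml):.0f}% of original)")
```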
Abstract:
Security and Privacy Online - the resource and flyer created for INFO2009, which provides an interactive web presentation to make the general public aware of the dangers of unsafe internet use and of how they can protect themselves.
Abstract:
BACKGROUND: The aim of this study was to develop a child-specific classification system for long bone fractures and to examine its reliability and validity on the basis of a prospective multicentre study. METHODS: Using the sequentially developed classification system, three samples of between 30 and 185 paediatric limb fractures, drawn from a pool of 2308 fractures documented in two multicentre studies, were analysed in a blinded fashion by eight orthopaedic surgeons on a total of five occasions. Intra- and interobserver reliability and accuracy were calculated. RESULTS: The reliability improved with successive simplification of the classification. The final version resulted in an overall interobserver agreement of κ = 0.71, with no significant difference between experienced and less experienced raters. CONCLUSIONS: In conclusion, the evaluation of the newly proposed classification system resulted in a reliable and routinely applicable system, for which training in its proper use may further improve the reliability. It can be recommended as a useful tool for clinical practice and offers the option for developing treatment recommendations and outcome predictions in the future.
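For context on the agreement statistic: Cohen's kappa corrects the observed agreement p_o for the agreement p_e expected by chance (the multi-rater variants used with eight raters, such as Fleiss' kappa, share the same structure):

```latex
\kappa = \frac{p_o - p_e}{1 - p_e}
```

Here κ = 0 indicates chance-level agreement and κ = 1 perfect agreement; on the commonly used Landis and Koch benchmarks, κ = 0.71 falls in the substantial-agreement range (0.61-0.80).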
Abstract:
Systematic reviews and meta-analyses of randomized trials that include patient-reported outcomes (PROs) often provide crucial information for patients, clinicians and policy-makers facing challenging health care decisions. Based on emerging methods, guidance on improving the interpretability of meta-analysis of patient-reported outcomes, typically continuous in nature, is likely to enhance decision-making. The objective of this paper is to summarize approaches to enhancing the interpretability of pooled estimates of PROs in meta-analyses. When differences in PROs between groups are statistically significant, decision-makers must be able to interpret the magnitude of effect. This is challenging when, as is often the case, clinical trial investigators use different measurement instruments for the same construct within and between individual randomized trials. For such cases, in addition to pooling results as a standardized mean difference, we recommend that systematic review authors use other methods to present results such as relative (relative risk, odds ratio) or absolute (risk difference) dichotomized treatment effects, complemented by presentation in either: natural units (e.g. overall depression reduced by 2.4 points when measured on a 50-point Hamilton Rating Scale for Depression); minimal important difference units (e.g. where 1.0 unit represents the smallest difference in depression that patients, on average, perceive as important, the depression score was 0.38 (95% CI 0.30 to 0.47) units lower than in the control group); or a ratio of means (e.g. where the mean in the treatment group is divided by the mean in the control group, the ratio of means is 1.27, representing a 27% relative reduction in the mean depression score).
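A worked sketch of two of these presentation formats, with invented pooled means (the numbers below are illustrative only, not taken from any review):

```python
# Ratio of means: pooled mean in the treatment group divided by the pooled
# mean in the control group (invented values on a depression scale).
mean_treatment = 10.2
mean_control = 13.4
rom = mean_treatment / mean_control
print(f"ratio of means = {rom:.2f}")  # 0.76, i.e. a 24% relative reduction

# MID units: divide the between-group difference by the minimal important
# difference, so 1.0 unit is the smallest change patients find important.
mid = 5.0  # invented MID for this instrument
diff = mean_treatment - mean_control
print(f"effect = {diff / mid:.2f} MID units")  # -0.64 MID units
```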
Abstract:
BACKGROUND: Robot-assisted therapy offers a promising approach to neurorehabilitation, particularly for severely to moderately impaired stroke patients. The objective of this study was to investigate the effects of intensive arm training on motor performance in four chronic stroke patients using the robot ARMin II. METHODS: ARMin II is an exoskeleton robot with six degrees of freedom (DOF) moving shoulder, elbow and wrist joints. Four volunteers with chronic (≥ 12 months post-stroke) left-sided hemiparesis and different levels of motor severity were enrolled in the study. They received robot-assisted therapy over a period of eight weeks, three to four therapy sessions per week, each session lasting one hour. Patients 1 and 4 had four one-hour training sessions per week, and patients 2 and 3 had three one-hour training sessions per week. The primary outcome variable was the Fugl-Meyer Assessment (FMA) score of the upper extremity; secondary outcomes were the Wolf Motor Function Test (WMFT), the Catherine Bergego Scale (CBS), the Maximal Voluntary Torques (MVTs) and a questionnaire about ADL tasks, progress, changes, motivation, etc. RESULTS: Three out of four patients showed significant improvements (p < 0.05) in the main outcome. The improvements in the FMA scores were aligned with the objective results of MVTs. Most improvements were maintained or even increased from discharge to the six-month follow-up. CONCLUSION: The data clearly indicate that intensive arm therapy with the robot ARMin II can significantly improve motor function of the paretic arm in some stroke patients, even those in a chronic state. The findings of the study provide a basis for a subsequent controlled randomized clinical trial.
Abstract:
Was the spread of agropastoralism from the Fertile Crescent throughout Europe influenced by rapid climatic shifts? Here we generate idealized climate events using palaeoclimate records. In a mathematical model of regional sociocultural development, these events disturb the subsistence base of simulated forager and farmer societies. We evaluate the simulated regional transition timings and durations against a published large set of radiocarbon dates for western Eurasia; the model is able to realistically hindcast much of the inhomogeneous space-time evolution of the regional Neolithic transitions. Our study shows that including climate events improves the simulation of typical lags between cultural complexes, but that the overall difference from a model without climate events is not significant. Climate events may not have been as important for early sociocultural dynamics as endogenous factors.
Abstract:
The Indus Valley Civilization (IVC) was one of the first great civilizations in prehistory. This Bronze Age civilization flourished from the end of the fourth millennium BC. It disintegrated during the second millennium BC; despite much research effort, this decline is not well understood. Less research has been devoted to the emergence of the IVC, which shows continuous cultural precursors since at least the seventh millennium BC. To understand the decline, we believe it is necessary to investigate the rise of the IVC, i.e., the establishment of agriculture and livestock, dense populations and technological developments 7000-3000 BC. Although much archaeologically typed information is available, our capability to investigate the system is hindered by poorly resolved chronology and by a lack of field work in the intermediate areas between the Indus valley and Mesopotamia. We thus employ a complementary numerical simulation to develop a consistent picture of technology, agropastoralism and population developments in the IVC domain. Results from this Global Land Use and technological Evolution Simulator show that there is (1) fair agreement between the simulated timing of the agricultural transition and radiocarbon dates from early agricultural sites, although the transition is simulated first in India and then in Pakistan; (2) an independent agropastoralism developing on the Indian subcontinent; and (3) a positive relationship between archaeological artifact richness and simulated population density which remains to be quantified.
Bathymetric map of Heron Reef, Australia, derived from airborne hyperspectral data at 1 m resolution
Abstract:
A simple method for efficient inversion of arbitrary radiative transfer models for image analysis is presented. The method operates by representing the shape of the function that maps model parameters to spectral reflectance by an adaptive look-up tree (ALUT) that evenly distributes the discretization error of tabulated reflectances in spectral space. A post-processing step organizes the data into a binary space partitioning tree that facilitates an efficient inversion search algorithm. In an example shallow water remote sensing application, the method performs faster than an implementation of previously published methodology and has the same accuracy in bathymetric retrievals. The method has no user configuration parameters requiring expert knowledge and minimizes the number of forward model runs required, making it highly suitable for routine operational implementation of image analysis methods. For the research community, straightforward and robust inversion allows research to focus on improving the radiative transfer models themselves without the added complication of devising an inversion strategy.
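The ALUT construction itself is detailed in the paper; as a simplified analogue of lookup-based inversion, one can tabulate an invented forward model on a regular parameter grid and invert an observed spectrum by nearest-neighbour search in spectral space, with a KD-tree standing in for the paper's binary space partitioning tree:

```python
import numpy as np
from scipy.spatial import cKDTree

# Invented two-parameter forward model: (depth, albedo) -> 4-band reflectance.
def forward_model(depth, albedo):
    bands = np.array([0.4, 0.5, 0.6, 0.7])
    return albedo * np.exp(-depth * bands)

# Tabulate the model on a regular parameter grid. The ALUT instead adapts
# its sampling to evenly distribute discretization error in spectral space.
depths = np.linspace(0.1, 20.0, 200)
albedos = np.linspace(0.05, 0.6, 50)
params = np.array([(d, a) for d in depths for a in albedos])
table = np.array([forward_model(d, a) for d, a in params])
tree = cKDTree(table)

# Invert an "observed" spectrum by nearest-neighbour lookup in the table.
observed = forward_model(7.3, 0.2)
_, idx = tree.query(observed)
print("retrieved (depth, albedo):", params[idx])
```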
Abstract:
In this thesis, dynamically adaptive multigrid methods are parallelized. In a dynamically adaptive multigrid method, a domain is covered with a grid, and computation on this grid proceeds by drawing on neighbouring grid points to determine the value at the next time step. Coarser and finer grids are then created and used, with the finer grids concentrating on subdomains; these subdomains change over the course of time. Using the additional grids improves the numerical properties. Such methods are usually parallelized by bisection. In the present work, the redistribution of the subdomains is instead realized by sending sets of individual grid points, which is a scheduling approach. The multigrid structures are built so that almost arbitrary distributions of grid points can occur on the grid levels. The structures are created once and changed only when necessary, so that no memory allocations are needed during the iterations. Besides the grid, additional structures are required, for example the boundary structures. A structure called Farbenfeld (colour field) records on which core an outer boundary point resides. In the parallel adaptive refinement, 5 x 5 point coverings are created for individual grid points selected by a decision criterion. To this end, the available decision information is used to determine more complex structures. The refinement grid therefore does not have to be torn down completely and rebuilt; only the changes to the grid need to be applied, which saves a great deal of computation time. The final step is load balancing. First, the load-transfer values are determined, which specify how many grid points are to be sent from where to where. This is done with a method called PLB, or a variant of it; PLB has so far been used mainly for combinatorial problems. Then the grid points to be sent are selected, using a strategy that decides which points of a core should be transferred to which neighbouring cores. In the last step, the selected points are migrated: all grid-point structures are rebuilt, and the information must be packed in such a way that the receiver can rebuild its own grid-point structures. In addition to the grid-point structures, structures for the parallel adaptive refinement must also be modified. It must be possible to forward grid points when load is sent over the load edges in several rounds. During load balancing, further work is done by a structure called Zwischenkorrektur (intermediate correction), which makes it possible to keep the Farbenfeld intact when neighbouring grid points are sent simultaneously.
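The abstract does not specify the PLB method itself; as a rough, hypothetical illustration of the first load-balancing step only, a diffusion-style computation of load-transfer values between neighbouring cores might look like this:

```python
# Each core's load = number of grid points it owns; edges connect
# neighbouring cores (invented example topology of four cores).
loads = {0: 1200, 1: 800, 2: 1000, 3: 600}
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

alpha = 0.25    # diffusion factor per load edge and round
transfers = {}  # (source core, destination core) -> grid points to send
for a, b in edges:
    flow = alpha * (loads[a] - loads[b])
    if flow > 0:
        transfers[(a, b)] = int(flow)
    else:
        transfers[(b, a)] = int(-flow)

# These transfer values say how many points to send from where to where;
# point selection and migration then follow, possibly over several rounds.
print(transfers)  # {(0, 1): 100, (2, 1): 50, (2, 3): 100, (0, 3): 150}
```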