153 results for Tablet computers
Abstract:
Understanding human movement is key to improving input devices and interaction techniques. This paper presents a study of mouse movements of motion-impaired users, with the aim of gaining a better understanding of impaired movement. The cursor trajectories of six motion-impaired users and three able-bodied users are studied according to their submovement structure. Several aspects of the movement are studied, including the frequency and duration of pauses between submovements, verification times, the number of submovements, the peak speed of submovements and the accuracy of submovements in two dimensions. Results include findings that some motion-impaired users pause more often and for longer than able-bodied users, require up to five times more submovements to complete the same task, and exhibit a correlation between error and peak submovement speed that does not exist for able-bodied users.
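As a rough illustration of the kind of submovement analysis described above, the sketch below segments a cursor speed profile into submovements and pauses. The thresholds and the parsing rule (a run of near-zero speed counts as a pause) are illustrative assumptions, not the parsing rules used in the paper.

```python
import numpy as np

def parse_submovements(xy, dt=0.01, pause_speed=10.0, min_pause=0.05):
    """xy: (n, 2) array of cursor samples in pixels; dt: sample interval in seconds.
    Thresholds are arbitrary placeholders for illustration."""
    speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) / dt   # px/s between samples
    moving = speed > pause_speed                               # True while the cursor moves
    edges = np.flatnonzero(np.diff(moving.astype(int))) + 1    # boundaries between runs
    submovements, pauses = [], []
    for seg in np.split(np.arange(len(speed)), edges):
        duration = len(seg) * dt
        if moving[seg[0]]:
            submovements.append({"duration": duration,
                                 "peak_speed": float(speed[seg].max())})
        elif duration >= min_pause:
            pauses.append(duration)
    return submovements, pauses

# example: a random-walk trajectory just to exercise the parser
subs, pauses = parse_submovements(np.cumsum(np.random.randn(500, 2), axis=0))
```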
Abstract:
An active pharmaceutical ingredient (API) was found to dissociate from the highly crystalline hydrochloride form to the amorphous free base form, with consequent alterations to tablet properties. Here, a wet granulation manufacturing process has been investigated using in situ Fourier transform (FT)-Raman spectroscopic analyses of granules and tablets prepared with different granulating fluids and under different manufacturing conditions. Dosage form stability under a range of storage stresses was also investigated. Despite the spectral similarities between the two drug forms, low levels of API dissociation could be quantified in the tablets; the technique allowed discrimination of around 4% of the API content as the amorphous free base (i.e. less than 1% of the tablet compression weight). API dissociation was shown to be promoted by extended exposure to moisture. Aqueous granulating fluids, manufacturing delays between the granulation and drying stages, and storage of the tablets in open conditions at 40 °C/75% relative humidity (RH) led to dissociation. In contrast, non-aqueous granulating fluids, no delay in processing, and storage of the tablets either in sealed containers or at lower temperature/humidity prevented detectable dissociation. It is concluded that appropriate manufacturing processes and storage conditions for the finished product involve minimising the exposure of the API to moisture. Analysis of the drug using FT-Raman spectroscopy allowed rapid optimisation of the process whilst offering quantitative molecular information concerning the dissociation of the drug salt to the amorphous free base form.
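The abstract does not state which calibration model was used; as a generic illustration of quantifying a minor spectral component, the sketch below fits reference spectra of the two drug forms to a sample spectrum by classical least squares. The function and variable names are invented for the example.

```python
import numpy as np

def estimate_free_base_fraction(sample, ref_salt, ref_free_base):
    """Classical least-squares fit of two reference spectra to a sample spectrum.
    All inputs are 1-D arrays on a common wavenumber grid; returns the fraction
    of the fitted signal attributed to the amorphous free base."""
    A = np.column_stack([ref_salt, ref_free_base])      # design matrix of references
    coeffs, *_ = np.linalg.lstsq(A, sample, rcond=None) # least-squares coefficients
    return coeffs[1] / coeffs.sum()
```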
Abstract:
In numerical weather prediction (NWP), data assimilation (DA) methods are used to combine available observations with numerical model estimates. This is done by minimising measures of error on both observations and model estimates, with more weight given to data that can be more trusted. For any DA method, an estimate of the initial forecast error covariance matrix is required. For convective-scale data assimilation, however, the properties of the error covariances are not well understood. An effective way to investigate covariance properties in the presence of convection is to use an ensemble-based method, for which an estimate of the error covariance is readily available at each time step. In this work, we investigate the performance of the ensemble square root filter (EnSRF), applied to an idealised 1D convective column model of the atmosphere, in the presence of cloud growth. We show that the EnSRF performs well in capturing cloud growth, but the ensemble does not cope well with discontinuities introduced into the system by parameterised rain. The state estimates lose accuracy, and more importantly the ensemble is unable to capture the spread (variance) of the estimates correctly. We also find, counter-intuitively, that by reducing the spatial frequency and/or the accuracy of the observations, the ensemble is able to capture the states and their variability successfully across all regimes.
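A minimal sketch of a serial EnSRF analysis step (in the spirit of the square-root formulation of Whitaker and Hamill) is given below, assuming each observation measures a single state component with independent error variance; the paper's convective column model and observation network are not reproduced here.

```python
import numpy as np

def ensrf_update(ensemble, obs, obs_idx, obs_var):
    """Serial EnSRF update. ensemble: (n_state, n_members); each observation
    obs[k] measures state component obs_idx[k] with error variance obs_var[k]."""
    X = ensemble.copy()
    m = X.shape[1]
    for y, j, r in zip(obs, obs_idx, obs_var):
        xbar = X.mean(axis=1, keepdims=True)            # ensemble mean
        Xp = X - xbar                                   # perturbations
        hxp = X[j, :] - X[j, :].mean()                  # observed perturbations
        phT = Xp @ hxp / (m - 1)                        # cov(x, Hx)
        hph = hxp @ hxp / (m - 1)                       # var(Hx)
        K = phT / (hph + r)                             # Kalman gain
        alpha = 1.0 / (1.0 + np.sqrt(r / (hph + r)))    # reduced gain for perturbations
        xbar_new = xbar[:, 0] + K * (y - X[j, :].mean())
        Xp_new = Xp - alpha * np.outer(K, hxp)
        X = xbar_new[:, None] + Xp_new
    return X

# example: 20-member ensemble of a 5-variable state, two point observations
np.random.seed(0)
ens = ensrf_update(np.random.randn(5, 20), obs=[1.0, 0.5],
                   obs_idx=[0, 3], obs_var=[0.1, 0.1])
```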
Abstract:
Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
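As an illustration of the two penalties being compared, the sketch below contrasts a standard Tikhonov (L2) solve with an L1-regularized estimate computed by a simple proximal-gradient (ISTA) iteration on the increment x - xb. The solver choice and parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def l2_assimilate(H, y, xb, lam=0.1):
    """Minimise 0.5*||H x - y||^2 + 0.5*lam*||x - xb||^2 (standard Tikhonov/L2)."""
    A = H.T @ H + lam * np.eye(len(xb))
    return np.linalg.solve(A, H.T @ y + lam * xb)

def l1_assimilate(H, y, xb, lam=0.1, n_iter=500):
    """Minimise 0.5*||H x - y||^2 + lam*||x - xb||_1 by proximal gradient (ISTA),
    working in the increment z = x - xb."""
    r = y - H @ xb
    step = 1.0 / np.linalg.norm(H.T @ H, 2)             # 1/Lipschitz constant of the gradient
    z = np.zeros_like(xb)
    for _ in range(n_iter):
        w = z - step * (H.T @ (H @ z - r))              # gradient step on the smooth term
        z = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)  # soft threshold (prox of L1)
    return xb + z
```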
Abstract:
The fundamental principles of the teaching methodology followed for dyslexic learners revolve around the need for a multisensory approach, which advocates repetition of learning tasks in an enjoyable way. The introduction of multimedia technologies in the field of education has supported the merging of new tools (digital camera, scanner) and techniques (sounds, graphics, animation) into a meaningful whole. Dyslexic learners are now given the opportunity to express their ideas using these alternative media and to participate actively in the educational process. This paper discusses the preliminary findings of a single case study of two English monolingual dyslexic children working together to create an open-ended multimedia project on a laptop computer. The project aimed to examine whether the multimedia environment could enhance the dyslexic learners’ skills in composition. Analysis of the data has indicated that the technological facilities gave the children the opportunity to enhance the style and content of their work for a variety of audiences and to develop responsibilities connected to authorship.
Abstract:
A recent paper published in this journal considers the numerical integration of the shallow-water equations using the leapfrog time-stepping scheme [Sun Wen-Yih, Sun Oliver MT. A modified leapfrog scheme for shallow water equations. Comput Fluids 2011;52:69–72]. The authors of that paper propose using the time-averaged height in the numerical calculation of the pressure-gradient force, instead of the instantaneous height at the middle time step. The authors show that this modification doubles the maximum Courant number (and hence the maximum time step) at which the integrations are stable, doubling the computational efficiency. Unfortunately, the pressure-averaging technique proposed by the authors is not original. It was devised and published by Shuman [5] and has been widely used in the atmosphere and ocean modelling community for over 40 years.
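For reference, the sketch below applies the pressure-averaging (Shuman) technique to a leapfrog integration of the linearised 1-D shallow-water equations: the height is stepped first, and the momentum equation then uses the time-averaged height in the pressure-gradient term. The grid, mean depth and time-step values are arbitrary, and the discretisation is a generic illustration rather than either paper's exact scheme.

```python
import numpy as np

g, H = 9.81, 10.0                       # gravity, mean depth
nx, dx, dt, nt = 100, 1.0e3, 40.0, 500  # grid points, spacing (m), step (s), steps
x = np.arange(nx) * dx

def ddx(f):                             # centred difference on a periodic domain
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dx)

# initial state: a Gaussian height bump, fluid at rest
h_old = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)
u_old = np.zeros(nx)
# first step by forward Euler to start the leapfrog
h_now = h_old - dt * H * ddx(u_old)
u_now = u_old - dt * g * ddx(h_old)

for n in range(nt):
    # continuity equation first, so the new-level height is available
    h_new = h_old - 2.0 * dt * H * ddx(u_now)
    # momentum equation with the time-averaged height (pressure averaging)
    h_avg = 0.25 * (h_new + 2.0 * h_now + h_old)
    u_new = u_old - 2.0 * dt * g * ddx(h_avg)
    h_old, h_now = h_now, h_new
    u_old, u_now = u_now, u_new
```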
Abstract:
Chitosan and its half-acetylated derivative have been compared as excipients in mucoadhesive tablets containing ibuprofen. Initially, the powder formulations containing the polymers and the drug were prepared by either co-spray drying or physical co-grinding. Polymer–drug interactions and the degree of drug crystallinity in these formulations were assessed by infrared spectroscopy and differential scanning calorimetry. Tablets were prepared and their swelling and dissolution properties were studied in media of various pHs. Mucoadhesive properties of ibuprofen-loaded and drug-free tablets were evaluated by analysing their detachment from pig gastric mucosa over a range of pHs. Greater polymer–drug interactions were seen for spray-dried particles compared to co-ground samples, and drug loading into chitosan-based microparticles (41%) was greater than into the corresponding half-acetylated samples (32%). Swelling and drug release were greater with the half-acetylated chitosan tablets than with tablets containing the parent polymer, and both types of tablet were mucoadhesive, the extent of which depended on substrate pH. The results illustrate the potential sustained drug delivery benefits of both chitosan and its half-acetylated derivative as mucoadhesive tablet excipients.
Abstract:
Polyvinylpyrrolidone is widely used in tablet formulations, with the linear form acting as a wetting agent and disintegrant, whereas the cross-linked form is a super-disintegrant. We have previously reported that simply mixing the commercial cross-linked polymer with ibuprofen disrupted drug crystallinity, with consequent improvements in drug dissolution behavior. In this study, we have designed and synthesized novel cross-linking agents containing a range of oligoether moieties, which have then been polymerized with vinylpyrrolidone to generate a suite of novel excipients with enhanced hydrogen-bonding capabilities. The polymers have a porous surface and swell in most common solvents and in water, properties which suggest their value as disintegrants. The polymers were evaluated in simple physical mixtures with ibuprofen as a model poorly water-soluble drug. The results show that the novel PVPs induce the drug to become “X-ray amorphous”, which increased dissolution to a greater extent than that seen with commercial cross-linked PVP. The polymers stabilize the amorphous drug, with no evidence of recrystallization seen after 20 weeks of storage.
Abstract:
Experiments demonstrating human enhancement through the implantation of technology in healthy humans have been performed for over a decade by some academic research groups. More recently, technology enthusiasts have begun to realize the potential of implantable technology such as glass capsule RFID transponders. In this paper it is argued that implantable RFID devices have evolved to the point where we should consider the devices themselves as simple computers. Presented here is the infection of an RFID device, implanted in a human, with a computer virus. Coupled with our developing concept of what constitutes the human body and its boundaries, it is argued that this study has given rise to the world’s first human infected with a computer virus. It has taken the wider academic community some time to agree that meaningful discourse on the topic of implantable technology is of value. As developments in medical technologies point to greater possibilities for enhancement, this shift in thinking is not too soon in coming.
Abstract:
This paper proposes and demonstrates an approach, Skilloscopy, to the assessment of decision makers. In an increasingly sophisticated, connected and information-rich world, decision making is becoming both more important and more difficult. At the same time, modelling decision-making on computers is becoming more feasible and of interest, partly because the information-input to those decisions is increasingly on record. The aims of Skilloscopy are to rate and rank decision makers in a domain relative to each other; the aims do not include an analysis of why a decision is wrong or suboptimal, nor the modelling of the underlying cognitive process of making the decisions. In the proposed method, a decision-maker is characterised by a probability distribution of their competence in choosing among quantifiable alternatives. This probability distribution is derived by classic Bayesian inference from a combination of prior belief and the evidence of the decisions. Thus, decision-makers’ skills may be better compared, rated and ranked. The proposed method is applied and evaluated in the game domain of Chess. A large set of games by players across a broad range of the World Chess Federation (FIDE) Elo ratings has been used to infer the distribution of players’ ratings directly from the moves they play rather than from game outcomes. Demonstration applications address questions frequently asked by the Chess community regarding the stability of the Elo rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The method of Skilloscopy may be applied in any decision domain where the value of the decision-options can be quantified.
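A toy sketch of this style of Bayesian skill inference is shown below: a grid of candidate skill levels, a softmax choice model over quantified move values, and a Bayes update after each observed decision. The softmax likelihood and the example numbers are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

skill_grid = np.linspace(0.0, 5.0, 101)      # candidate skill levels
posterior = np.ones_like(skill_grid)         # flat prior belief over skill
posterior /= posterior.sum()

def choice_likelihood(option_values, chosen, s):
    """P(chosen option | skill s) under a softmax over the option values."""
    z = np.exp(s * (option_values - option_values.max()))
    return z[chosen] / z.sum()

# observed decisions: (quantified values of the alternatives, index chosen)
decisions = [
    (np.array([0.3, 0.1, -0.2]), 0),
    (np.array([0.0, 0.5, 0.4, -1.0]), 1),
    (np.array([-0.1, -0.3]), 1),
]

for values, chosen in decisions:
    likelihood = np.array([choice_likelihood(values, chosen, s) for s in skill_grid])
    posterior *= likelihood                  # Bayesian update with the new evidence
    posterior /= posterior.sum()

print("posterior mean skill:", (skill_grid * posterior).sum())
```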
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
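As a loose illustration of pairing a generic record structure with a controlled vocabulary, the sketch below rejects metadata values that fall outside an allowed term list. The field names and vocabulary terms are invented for the example and are not taken from the actual CIM schemas.

```python
# Hypothetical controlled vocabularies for a generic metadata record.
CONTROLLED_VOCAB = {
    "activity_type": {"simulation", "ensemble", "downscaling"},
    "calendar": {"gregorian", "360_day", "noleap"},
}

def validate(record):
    """Return a list of violations where a constrained field uses a term
    outside its controlled vocabulary."""
    errors = []
    for field, allowed in CONTROLLED_VOCAB.items():
        if field in record and record[field] not in allowed:
            errors.append(f"{field}: '{record[field]}' not in {sorted(allowed)}")
    return errors

print(validate({"activity_type": "simulation", "calendar": "julian"}))
```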
Abstract:
A fragmentary tablet from Vindolanda (Tab. Vindol. II, 213) contains an occurrence of the verb interpretari (‘interpret’, ‘explain’, ‘mediate’) in an apparently commercial context, relating to the grain supply for the Roman fort. This usage is paralleled in a text on a wooden stilus tablet from Frisia in the Netherlands. ‘Interpreters’ and their activities make rather infrequent appearances in the Latin epigraphic and documentary records. In the Danubian provinces, interpreters (interpretes) are attested as army officers and officials in the office of the provincial governor. ‘Interpreters’, in both Latin and Greek inscriptions and papyri, often, however, play more ambiguous roles, not always connected with language mediation but also, or instead, with mediation in commercial transactions.
Abstract:
This is a comprehensive textbook for students of Television Studies, now updated for its third edition. The book provides students with a framework for understanding the key concepts and main approaches to Television Studies, including audience research, television history and broadcasting policy, and the analytical study of individual programmes. The book includes a glossary of key terms used in the television industry and in the academic study of television, suggestions for further reading at the end of each chapter, and suggested activities for use in class or as assignments. The case studies in the book include analysis of advertisements, approaches to news reporting, television scheduling, and challenges to television in new contexts of viewing on computers and mobile devices. The topics of individual chapters are: studying television, television histories, television cultures, television texts and narratives, television genres and formats, television production, television quality and value, television realities and representation, television censorship and regulation, television audiences, and the likely future for television.
Abstract:
In a world where data is captured on a large scale, the major challenge for data mining algorithms is to be able to scale up to large datasets. There are two main approaches to inducing classification rules: one is the divide-and-conquer approach, also known as top-down induction of decision trees; the other is the separate-and-conquer approach. A considerable amount of work has been done on scaling up the divide-and-conquer approach. However, very little work has been conducted on scaling up the separate-and-conquer approach. In this work we describe a parallel framework that allows the parallelisation of a certain family of separate-and-conquer algorithms, the Prism family. Parallelisation helps the Prism family of algorithms to harness additional computing resources in a network of computers in order to make the induction of classification rules scale better on large datasets. Our framework also incorporates a pre-pruning facility for parallel Prism algorithms.
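For orientation, the sketch below shows the basic (unparallelised) Prism-style separate-and-conquer loop for a single target class on categorical data; the paper's parallel framework and pre-pruning facility are not reproduced, and the details are a generic rendering of the Prism idea rather than the authors' code.

```python
import numpy as np

def prism_rules_for_class(X, y, target):
    """Learn rules of the form (attr == value) AND ... => target, Prism-style.
    X: (n, n_attrs) array of categorical values; y: (n,) class labels."""
    X, y = X.copy(), y.copy()
    rules = []
    while (y == target).any():                     # until the class is fully covered
        covered = np.ones(len(y), dtype=bool)
        rule = []                                  # conjunction of (attribute, value) tests
        # conquer: specialise until only target-class examples are covered
        while not (y[covered] == target).all():
            best = None
            for a in range(X.shape[1]):
                if any(term[0] == a for term in rule):
                    continue                       # each attribute used at most once per rule
                for v in np.unique(X[covered, a]):
                    sel = covered & (X[:, a] == v)
                    prob = (y[sel] == target).mean()
                    if best is None or prob > best[0]:
                        best = (prob, a, v)
            if best is None:                       # no attributes left to test
                break
            _, a, v = best
            rule.append((a, v))
            covered &= (X[:, a] == v)
        rules.append(rule)
        X, y = X[~covered], y[~covered]            # separate: drop the covered examples
    return rules
```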