962 results for Tablet computers
Abstract:
An active pharmaceutical ingredient (API) was found to dissociate from the highly crystalline hydrochloride form to the amorphous free base form, with consequent alterations to tablet properties. Here, a wet granulation manufacturing process has been investigated using in situ Fourier transform (FT)-Raman spectroscopic analyses of granules and tablets prepared with different granulating fluids and under different manufacturing conditions. Dosage form stability under a range of storage stresses was also investigated. Despite the spectral similarities between the two drug forms, low levels of API dissociation could be quantified in the tablets; the technique allowed discrimination of around 4% of the API content as the amorphous free base (i.e. less than 1% of the tablet compression weight). API dissociation was shown to be promoted by extended exposure to moisture: aqueous granulating fluids, manufacturing delays between the granulation and drying stages, and storage of the tablets in open conditions at 40 °C/75% relative humidity (RH) all led to dissociation. In contrast, non-aqueous granulating fluids, no delay in processing, and storage of the tablets either in sealed containers or at lower temperature/humidity prevented detectable dissociation. It is concluded that appropriate manufacturing and storage conditions for the finished product involve minimising the API's exposure to moisture. FT-Raman spectroscopy allowed rapid optimisation of the process whilst offering quantitative molecular information on the dissociation of the drug salt to the amorphous free base form.
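The abstract does not specify the chemometric model behind the quantification, but a minimal sketch of one standard approach, classical least-squares fitting of the tablet spectrum against reference spectra of the two forms, illustrates how a small free-base fraction might be resolved despite the spectral similarity. All names here are illustrative assumptions, not the paper's method:

```python
import numpy as np

def estimate_fractions(mixture, ref_salt, ref_freebase):
    """Fit mixture ~ a*ref_salt + b*ref_freebase by least squares and
    return the normalised fractions (a, b). Spectra are 1D intensity
    arrays on a common wavenumber grid."""
    A = np.column_stack([ref_salt, ref_freebase])
    coeffs, *_ = np.linalg.lstsq(A, mixture, rcond=None)
    coeffs = np.clip(coeffs, 0.0, None)   # forbid negative contributions
    return coeffs / coeffs.sum()
```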
Abstract:
In numerical weather prediction (NWP), data assimilation (DA) methods are used to combine available observations with numerical model estimates. This is done by minimising measures of error on both observations and model estimates, with more weight given to data that can be more trusted. Any DA method requires an estimate of the initial forecast error covariance matrix. For convective-scale data assimilation, however, the properties of the error covariances are not well understood. An effective way to investigate covariance properties in the presence of convection is to use an ensemble-based method, for which an estimate of the error covariance is readily available at each time step. In this work, we investigate the performance of the ensemble square root filter (EnSRF) in the presence of cloud growth, applied to an idealised 1D convective-column model of the atmosphere. We show that the EnSRF performs well in capturing cloud growth, but the ensemble does not cope well with discontinuities introduced into the system by parameterised rain. The state estimates lose accuracy and, more importantly, the ensemble is unable to capture the spread (variance) of the estimates correctly. We also find, counter-intuitively, that by reducing the spatial frequency and/or the accuracy of the observations, the ensemble is able to capture the states and their variability successfully across all regimes.
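For reference, the ensemble square root filter named here can be sketched in a few lines. The following is a minimal serial EnSRF analysis step in the style of Whitaker and Hamill (2002), assimilating observations one at a time: the ensemble mean is updated with the full Kalman gain and the perturbations with a reduced gain, so no perturbed observations are needed. This is a generic sketch, not the paper's code; the idealised column model, cloud variables and all names below are illustrative.

```python
import numpy as np

def ensrf_update(ensemble, obs, obs_var, H):
    """Serial EnSRF analysis step (sketch).

    ensemble : (n_state, n_members) array of state vectors
    obs      : 1D array of observed values
    obs_var  : scalar observation error variance (same for every obs)
    H        : (n_obs, n_state) linear observation operator
    """
    ens = ensemble.copy()
    m = ens.shape[1]
    for i, y in enumerate(obs):                 # assimilate one obs at a time
        h = H[i]
        x_mean = ens.mean(axis=1)
        X = ens - x_mean[:, None]               # ensemble perturbations
        hx = h @ ens                            # ensemble in observation space
        hX = hx - hx.mean()
        s = hX @ hX / (m - 1) + obs_var         # innovation variance HPH' + R
        K = X @ hX / ((m - 1) * s)              # Kalman gain (a column vector)
        alpha = 1.0 / (1.0 + np.sqrt(obs_var / s))  # square root reduction factor
        x_mean = x_mean + K * (y - hx.mean())   # mean: full gain
        X = X - alpha * np.outer(K, hX)         # perturbations: reduced gain
        ens = x_mean[:, None] + X
    return ens
```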
Abstract:
Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
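Schematically, the two formulations differ only in the norm applied to the regularization (background) term of the assimilation cost function. A sketch in standard variational notation, with x_b the prior (background) state, B and R_i the background and observation error covariances, H_i the observation operators and M the model; the paper's exact weighting of the L1 term may differ:

```latex
% Tikhonov (L2) regularised cost function (schematic):
J_2(x_0) = \|x_0 - x_b\|_{B^{-1}}^2
         + \sum_{i=0}^{N} \|y_i - \mathcal{H}_i(x_i)\|_{R_i^{-1}}^2,
\qquad x_i = \mathcal{M}_{0 \to i}(x_0).

% L1-regularised alternative: an l1 penalty on the departure from the
% background, which tolerates sparse, large corrections (fronts, shocks):
J_1(x_0) = \lambda \|x_0 - x_b\|_1
         + \sum_{i=0}^{N} \|y_i - \mathcal{H}_i(x_i)\|_{R_i^{-1}}^2.
```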
Abstract:
The fundamental principles of the teaching methodology followed for dyslexic learners revolve around the need for a multisensory approach, which advocates repetition of learning tasks in an enjoyable way. The introduction of multimedia technologies in the field of education has supported the merging of new tools (digital camera, scanner) and techniques (sounds, graphics, animation) into a meaningful whole. Dyslexic learners are now given the opportunity to express their ideas using these alternative media and to participate actively in the educational process. This paper discusses the preliminary findings of a single case study of two English monolingual dyslexic children working together to create an open-ended multimedia project on a laptop computer. The project aimed to examine whether, and how, the multimedia environment could enhance the dyslexic learners' skills in composition. Analysis of the data indicates that the technological facilities gave the children the opportunity to enhance the style and content of their work for a variety of audiences and to develop responsibilities connected to authorship.
Abstract:
A recent paper published in this journal considers the numerical integration of the shallow-water equations using the leapfrog time-stepping scheme [Sun Wen-Yih, Sun Oliver MT. A modified leapfrog scheme for shallow water equations. Comput Fluids 2011;52:69–72]. The authors of that paper propose using the time-averaged height in the numerical calculation of the pressure-gradient force, instead of the instantaneous height at the middle time step. The authors show that this modification doubles the maximum Courant number (and hence the maximum time step) at which the integrations are stable, doubling the computational efficiency. Unfortunately, the pressure-averaging technique proposed by the authors is not original. It was devised and published by Shuman [5] and has been widely used in the atmosphere and ocean modelling community for over 40 years.
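For context, the pressure-averaging idea can be sketched for the linearised 1D shallow-water equations. In a leapfrog step the height can be updated first from the continuity equation, so the time-averaged height is available explicitly for the pressure-gradient force in the momentum equation. The (1/4, 1/2, 1/4) weights below are one common choice; the details of Shuman's and Sun and Sun's formulations may differ, and all names here are illustrative.

```python
import numpy as np

def leapfrog_step(u_old, u_now, h_old, h_now, g, H, dt, dx):
    """One leapfrog step for the linearised 1D shallow-water equations
    with pressure averaging in the momentum equation (periodic domain)."""
    def ddx(f):
        return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dx)

    # Continuity first: h^{n+1} depends only on u^n, so it is explicit.
    h_new = h_old - 2.0 * dt * H * ddx(u_now)

    # Pressure averaging: replace h^n in the pressure-gradient force by
    # the time average (h^{n+1} + 2 h^n + h^{n-1}) / 4.
    h_avg = 0.25 * (h_new + 2.0 * h_now + h_old)
    u_new = u_old - 2.0 * dt * g * ddx(h_avg)

    return u_new, h_new
```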
Abstract:
Chitosan and its half-acetylated derivative have been compared as excipients in mucoadhesive tablets containing ibuprofen. Initially, the powder formulations containing the polymers and the drug were prepared by either co-spray-drying or physical co-grinding. Polymer–drug interactions and the degree of drug crystallinity in these formulations were assessed by infrared spectroscopy and differential scanning calorimetry. Tablets were prepared and their swelling and dissolution properties were studied in media of various pHs. Mucoadhesive properties of ibuprofen-loaded and drug-free tablets were evaluated by analysing their detachment from pig gastric mucosa over a range of pHs. Greater polymer–drug interactions were seen for spray-dried particles than for co-ground samples, and drug loading into chitosan-based microparticles (41%) was greater than that of the corresponding half-acetylated samples (32%). Swelling and drug release were greater with the half-acetylated chitosan tablets than with tablets containing the parent polymer, and both types of tablet were mucoadhesive, to an extent dependent on substrate pH. The results illustrate the potential sustained drug delivery benefits of both chitosan and its half-acetylated derivative as mucoadhesive tablet excipients.
Abstract:
Polyvinylpyrrolidone (PVP) is widely used in tablet formulations, with the linear form acting as a wetting agent and disintegrant, whereas the cross-linked form is a super-disintegrant. We have previously reported that simply mixing the commercial cross-linked polymer with ibuprofen disrupted drug crystallinity, with consequent improvements in drug dissolution behavior. In this study, we have designed and synthesized novel cross-linking agents containing a range of oligoether moieties, which have then been polymerized with vinylpyrrolidone to generate a suite of novel excipients with enhanced hydrogen-bonding capabilities. The polymers have a porous surface and swell in most common solvents and in water, properties which suggest their value as disintegrants. The polymers were evaluated in simple physical mixtures with ibuprofen as a model poorly water-soluble drug. The results show that the novel PVPs induce the drug to become "X-ray amorphous", which increases dissolution to a greater extent than that seen with commercial cross-linked PVP. The polymers stabilize the amorphous drug, with no evidence of recrystallization seen after 20 weeks of storage.
Abstract:
Experiments demonstrating human enhancement through the implantation of technology in healthy humans have been performed for over a decade by some academic research groups. More recently, technology enthusiasts have begun to realize the potential of implantable technology such as glass capsule RFID transponders. In this paper it is argued that implantable RFID devices have evolved to the point where we should consider the devices themselves as simple computers. Presented here is the infection, with a computer virus, of an RFID device implanted in a human. Coupled with our developing concept of what constitutes the human body and its boundaries, it is argued that this study has given rise to the world's first human to be infected with a computer virus. It has taken the wider academic community some time to agree that meaningful discourse on the topic of implantable technology is of value. As developments in medical technologies point to greater possibilities for enhancement, this shift in thinking is not too soon in coming.
Abstract:
This paper proposes and demonstrates an approach, Skilloscopy, to the assessment of decision makers. In an increasingly sophisticated, connected and information-rich world, decision making is becoming both more important and more difficult. At the same time, modelling decision-making on computers is becoming more feasible and of interest, partly because the information-input to those decisions is increasingly on record. The aims of Skilloscopy are to rate and rank decision makers in a domain relative to each other; the aims do not include an analysis of why a decision is wrong or suboptimal, nor the modelling of the underlying cognitive process of making the decisions. In the proposed method, a decision-maker is characterised by a probability distribution of their competence in choosing among quantifiable alternatives. This probability distribution is derived by classic Bayesian inference from a combination of prior belief and the evidence of the decisions. Thus, decision-makers' skills may be better compared, rated and ranked. The proposed method is applied and evaluated in the game domain of chess. A large set of games by players across a broad range of World Chess Federation (FIDE) Elo ratings has been used to infer the distribution of players' ratings directly from the moves they play rather than from game outcomes. Demonstration applications address questions frequently asked by the chess community regarding the stability of the Elo rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The method of Skilloscopy may be applied in any decision domain where the value of the decision-options can be quantified.
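The abstract does not spell out the likelihood model, but the Bayesian machinery itself is compact. The sketch below assumes a softmax choice model over engine-evaluated move values, in which stronger players concentrate probability on better moves; the model, names and numbers are illustrative, not the paper's.

```python
import numpy as np

def update_skill_posterior(prior, skills, move_values, chosen_idx):
    """One Bayesian update of a discrete posterior over skill levels.

    prior       : probabilities over candidate skill levels (sums to 1)
    skills      : array of candidate skill levels (higher = stronger)
    move_values : engine evaluations of the legal moves in one position
    chosen_idx  : index of the move actually played
    """
    likelihoods = np.empty_like(prior)
    for k, s in enumerate(skills):
        z = s * np.asarray(move_values, dtype=float)
        p = np.exp(z - z.max())           # numerically stable softmax
        p /= p.sum()
        likelihoods[k] = p[chosen_idx]    # P(observed move | skill s)
    posterior = prior * likelihoods       # Bayes' rule, then renormalise
    return posterior / posterior.sum()

# Example: three candidate skills, one position with three legal moves.
skills = np.array([0.5, 2.0, 8.0])
prior = np.full(3, 1.0 / 3.0)
posterior = update_skill_posterior(prior, skills, [1.0, 0.2, -0.5], chosen_idx=0)
```

Applied move by move over many games, such updates concentrate the posterior on the skill levels most consistent with the player's observed choices, which is what allows rating from moves rather than from game outcomes.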
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team's development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change's (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
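As a toy illustration of pairing a generic structure with a controlled vocabulary (hypothetical names and vocabulary terms throughout; the real CIM is defined as UML packages and XML schemas, not Python):

```python
# The generic "activity" record accepts any strings, while a controlled
# vocabulary (CV) restricts which values constitute a valid instance.
ACTIVITY_TYPES_CV = {"simulation", "ensemble", "downscaling"}  # illustrative terms

def is_valid_activity(record):
    """Check a minimal 'activity' record against the controlled vocabulary."""
    required = {"name", "activity_type", "software", "produces"}
    return required <= set(record) and record["activity_type"] in ACTIVITY_TYPES_CV

run = {"name": "experiment-001", "activity_type": "simulation",
       "software": "some-climate-model", "produces": "output-dataset-001"}
assert is_valid_activity(run)
```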
Abstract:
A fragmentary tablet from Vindolanda (Tab. Vindol. II, 213) contains an occurrence of the verb interpretari ('interpret', 'explain', 'mediate') in an apparently commercial context, relating to the grain supply for the Roman fort. This usage is paralleled in a text on a wooden stilus tablet from Frisia in the Netherlands. 'Interpreters' and their activities make rather infrequent appearances in the Latin epigraphic and documentary records. In the Danubian provinces, interpreters (interpretes) are attested as army officers and officials in the office of the provincial governor. In both Latin and Greek inscriptions and papyri, however, 'interpreters' often play more ambiguous roles, connected not always with language-mediation but also, or instead, with mediation in commercial transactions.
Abstract:
This is a comprehensive textbook for students of Television Studies, now updated for its third edition. The book provides students with a framework for understanding the key concepts and main approaches to Television Studies, including audience research, television history and broadcasting policy, and the analytical study of individual programmes. The book includes a glossary of key terms used in the television industry and in the academic study of television; each chapter ends with suggestions for further reading and suggested activities for use in class or as assignments. The case studies in the book include analysis of advertisements, approaches to news reporting, television scheduling, and challenges to television in new contexts of viewing on computers and mobile devices. The topics of individual chapters are: studying television, television histories, television cultures, television texts and narratives, television genres and formats, television production, television quality and value, television realities and representation, television censorship and regulation, television audiences, and the likely future for television.
Abstract:
In a world where data is captured on a large scale, the major challenge for data mining algorithms is to be able to scale up to large datasets. There are two main approaches to inducing classification rules: one is the divide-and-conquer approach, also known as the top-down induction of decision trees; the other is the separate-and-conquer approach. A considerable amount of work has been done on scaling up the divide-and-conquer approach; however, very little work has been conducted on scaling up the separate-and-conquer approach. In this work we describe a parallel framework that allows the parallelisation of a certain family of separate-and-conquer algorithms, the Prism family. Parallelisation helps the Prism family of algorithms to harness additional computing resources in a network of computers, so that the induction of classification rules scales better on large datasets. Our framework also incorporates a pre-pruning facility for parallel Prism algorithms.
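To fix ideas, a minimal sketch of separate-and-conquer rule induction in the style of the Prism family follows. Instance removal and tie-breaking details vary between Prism variants, and the parallel framework itself is not reproduced; the sketch is sequential and illustrative only.

```python
def learn_rules_for_class(instances, target_class):
    """Separate-and-conquer rule induction in the style of Prism (sketch).

    instances : list of (attributes_dict, class_label) pairs
    Returns a list of rules, each a dict of attribute -> required value.
    """
    def covers(rule, attrs):
        return all(attrs.get(a) == v for a, v in rule.items())

    rules = []
    pool = list(instances)
    while any(c == target_class for _, c in pool):
        rule, covered = {}, pool
        # Conquer: specialise until the rule covers only the target class
        # (or no attribute-value pair is left to add).
        while any(c != target_class for _, c in covered):
            best = None
            for attrs, _ in covered:
                for a, v in attrs.items():
                    if a in rule:
                        continue
                    sub = [x for x in covered if x[0].get(a) == v]
                    acc = sum(1 for _, c in sub if c == target_class) / len(sub)
                    if best is None or acc > best[0]:
                        best = (acc, a, v)
            if best is None:
                break
            rule[best[1]] = best[2]
            covered = [x for x in covered if x[0].get(best[1]) == best[2]]
        rules.append(rule)
        # Separate: remove the instances the new rule covers.
        pool = [x for x in pool if not covers(rule, x[0])]
    return rules
```

The parallelisation opportunity described in the abstract lies in the inner search: candidate attribute-value pairs can be evaluated on different machines over partitions of the data before the best is selected.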
Abstract:
The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has been replaced more recently with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree, etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organize the chaos of the material world along (post-)Fordist rationality.

Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, and more.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Noble opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art, around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Together, Chaney and Hernly developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk