954 results for: software distribution
Abstract:
This study investigated the relative contribution of ion-trapping, microsomal binding, and distribution of unbound drug as determinants of the hepatic retention of basic drugs in the isolated perfused rat liver. The ionophore monensin was used to abolish the vesicular proton gradient and thus allow an estimation of the ion-trapping of cationic drugs by acidic hepatic vesicles. In vitro microsomal studies were used to independently estimate microsomal binding and metabolism. Hepatic vesicular ion-trapping, intrinsic elimination clearance, permeability-surface area product, and intracellular binding were derived using a physiologically based pharmacokinetic model. Modeling showed that ion-trapping was significantly lower after monensin treatment for atenolol and propranolol, but not for antipyrine. However, no monensin-induced changes were observed in intrinsic clearance, permeability, or binding for the three model drugs. Monensin did not affect binding or metabolic activity in vitro for the drugs. The observed ion-trapping was similar to theoretical values estimated from the pHs and fractional volumes of the acidic vesicles and the pKa values of the drugs. Lipophilicity and pKa determined hepatic drug retention: a drug with low pKa and low lipophilicity (e.g., antipyrine) distributes as unbound drug, a drug with high pKa and low lipophilicity (e.g., atenolol) is retained by ion-trapping, and a drug with high pKa and high lipophilicity (e.g., propranolol) is retained by ion-trapping and intracellular binding. In conclusion, monensin inhibits the ion-trapping of high-pKa basic drugs, leading to a reduction in hepatic retention but with no effect on hepatic drug extraction.
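The theoretical ion-trapping values mentioned above follow from the standard pH-partition (Henderson-Hasselbalch) relation: only the neutral form of a basic drug crosses membranes freely, so total drug accumulates in the more acidic compartment. A minimal sketch of that relation, with illustrative (assumed) compartment pHs and drug pKa values:

```python
def trapping_ratio(pka: float, ph_vesicle: float, ph_cytosol: float) -> float:
    """Steady-state total-concentration ratio (vesicle/cytosol) for a
    monoprotic base under the pH-partition hypothesis: the unionized
    species equilibrates across the membrane, so each compartment holds
    total drug in proportion to 1 + 10**(pKa - pH)."""
    ionized = lambda ph: 1.0 + 10.0 ** (pka - ph)
    return ionized(ph_vesicle) / ionized(ph_cytosol)

# Illustrative values only (vesicle pH 5.0, cytosol pH 7.2 are assumptions):
high_pka = trapping_ratio(9.5, 5.0, 7.2)   # propranolol-like strong base
low_pka  = trapping_ratio(1.4, 5.0, 7.2)   # antipyrine-like weak base
```

A high-pKa base is concentrated more than a hundredfold in the acidic vesicles under these assumptions, while a low-pKa base such as antipyrine shows essentially no trapping, matching the qualitative pattern the abstract reports.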
Abstract:
Software Configuration Management is the discipline of managing large collections of software development artefacts from which software products are built. Software configuration management tools typically deal with artefacts at fine levels of granularity - such as individual source code files - and assist with coordination of changes to such artefacts. This paper describes a lightweight tool, designed to be used on top of a traditional file-based configuration management system. The add-on tool support enables users to flexibly define new hierarchical views of product structure, independent of the underlying artefact-repository structure. The tool extracts configuration and change data with respect to the user-defined hierarchy, leading to improved visibility of how individual subsystems have changed. The approach yields a range of new capabilities for build managers, and verification and validation teams. The paper includes a description of our experience using the tool in an organization that builds large embedded software systems.
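The core idea of the add-on tool, mapping fine-grained repository artefacts into a user-defined subsystem hierarchy and aggregating change data against it, can be sketched as follows. The view definition, subsystem names, and paths are all hypothetical; the abstract does not specify the tool's data model.

```python
from collections import defaultdict

# Hypothetical user-defined view: subsystem name -> path prefixes in the
# underlying file-based repository (names and layout are illustrative).
VIEW = {
    "powertrain/control": ["src/ctrl/", "src/pid/"],
    "powertrain/diagnostics": ["src/diag/"],
    "ui": ["src/hmi/"],
}

def changes_by_subsystem(change_log):
    """Aggregate per-file change counts under the user-defined hierarchy,
    independently of how the artefact repository itself is laid out."""
    totals = defaultdict(int)
    for path, n_changes in change_log:
        for subsystem, prefixes in VIEW.items():
            if any(path.startswith(p) for p in prefixes):
                totals[subsystem] += n_changes
    return dict(totals)

log = [("src/ctrl/loop.c", 4), ("src/diag/obd.c", 2), ("src/hmi/menu.c", 7)]
print(changes_by_subsystem(log))
# {'powertrain/control': 4, 'powertrain/diagnostics': 2, 'ui': 7}
```

The point of the design is that the view is data, not repository structure: build managers can define several competing hierarchies over the same artefacts without touching the configuration management system underneath.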
Abstract:
This study introduces the use of combined Na-23 magnetic resonance imaging (MRI) and Na-23 NMR relaxometry for the study of meat curing. The diffusion of sodium ions into the meat was measured using Na-23 MRI on a 1 kg meat sample brined in 10% w/w NaCl for 3-100 h. Calculations revealed a diffusion coefficient of 1 × 10⁻⁵ cm²/s after 3 h of curing, subsequently decreasing to 8 × 10⁻⁶ cm²/s at longer curing times, suggesting that changes occur in the microscopic structure of the meat during curing. The microscopic mobility and distribution of sodium were measured using Na-23 relaxometry. Two sodium populations were observed, and with increasing curing time the relaxation times of these changed, reflecting a salt-induced swelling and an increase in myofibrillar pore sizes. Accordingly, the present study demonstrated that pore size, and thereby salt-induced swelling in meat, can be assessed using Na-23 relaxometry.
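To put the reported diffusion coefficients in perspective, the standard order-of-magnitude diffusion length for one-dimensional diffusion into a semi-infinite medium, L ≈ 2·sqrt(D·t), can be evaluated at the abstract's values (this is a generic textbook estimate, not the study's own calculation):

```python
import math

def penetration_depth_cm(D_cm2_s: float, t_hours: float) -> float:
    """Characteristic diffusion length L ~ 2*sqrt(D*t) for 1-D diffusion
    into a semi-infinite medium; an order-of-magnitude estimate only."""
    return 2.0 * math.sqrt(D_cm2_s * t_hours * 3600.0)

early = penetration_depth_cm(1e-5, 3.0)    # D after 3 h of curing
late  = penetration_depth_cm(8e-6, 100.0)  # lower D at long curing times
```

Even with the reduced coefficient, sodium penetrates on the order of centimetres over the 100 h brining window, consistent with curing a 1 kg sample through.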
Abstract:
Smoke inhalation injuries are the leading cause of mortality from burn injury. Airway obstruction due to mucus plugging and bronchoconstriction can cause severe ventilation inhomogeneity and worsen hypoxia. Studies describing changes in the viscoelastic characteristics of the lung after smoke inhalation are missing. We present results of a new smoke inhalation device in sheep and describe pathophysiological changes after smoke exposure. Fifteen female Merino ewes were anesthetized and intubated. Baseline data using electrical impedance tomography and multiple-breath inert-gas washout were obtained by measuring ventilation distribution, functional residual capacity, lung clearance index, dynamic compliance, and stress index. Ten sheep were exposed to standardized cotton smoke insufflations and five sheep to sham smoke insufflations. Measured carboxyhemoglobin was 3.87 ± 0.28% before inhalation and 61.5 ± 2.1% (range 50-69.4%) 5 min after smoke (P < 0.001). Two hours after smoke exposure, functional residual capacity decreased from 1,773 ± 226 to 1,006 ± 129 ml and the lung clearance index increased from 10.4 ± 0.4 to 14.2 ± 0.9. Dynamic compliance decreased from 56.6 ± 5.5 to 32.8 ± 3.2 ml/cmH₂O. The stress index increased from 0.994 ± 0.009 to 1.081 ± 0.011 (P < 0.01) (all means ± SE, P < 0.05). Electrical impedance tomography showed a shift of ventilation from the dependent to the independent lung after smoke exposure. No significant change was seen in the sham group. Smoke inhalation caused immediate onset of pulmonary dysfunction and significant ventilation inhomogeneity. The smoke inhalation device as presented may be useful for interventional studies.
Abstract:
The contributions in this research are split into three distinct, but related, areas. The focus of the work is on improving the efficiency of video content distribution in networks that are liable to packet loss, such as the Internet. Initially, the benefits and limitations of content distribution using Forward Error Correction (FEC) in conjunction with the Transmission Control Protocol (TCP) are presented. Since added FEC can be used to reduce the number of retransmissions, the requirement for TCP to deal with any losses is greatly reduced. When real-time applications are needed, delay must be kept to a minimum and retransmissions are not desirable. A balance must therefore be struck between additional bandwidth and delays due to retransmissions. This is followed by the proposal of a hybrid transport, specifically for H.264 encoded video, as a compromise between the delay-prone TCP and the loss-prone UDP. It is argued that the playback quality at the receiver often need not be 100% perfect, provided a certain level is assured. Reliable TCP is used to transmit and guarantee delivery of the most important packets. The delay associated with the proposal is measured, and the potential for use as an alternative to the conventional methods of transporting video by either TCP or UDP alone is demonstrated. Finally, a new objective measurement is investigated for assessing the playback quality of video transported using TCP. A new metric is defined to characterise the quality of playback in terms of its continuity. Using packet traces generated from real TCP connections in a lossy environment, the playback of a video can be simulated whilst monitoring buffer behaviour to calculate pause intensity values. Subjective tests are conducted to verify the effectiveness of the metric and show that the objective and subjective scores are closely correlated.
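The buffer-driven playback simulation described above can be sketched as follows. This is an illustrative model only: the thesis's exact definition of pause intensity is not given in the abstract, so here it is taken as total stalled time divided by total playback session time, and all parameter values are assumptions.

```python
def pause_intensity(arrival_times, frame_duration=0.04, startup_frames=5):
    """Simulate playback of frames arriving at the given times (seconds).
    Playback starts once `startup_frames` frames have buffered; whenever
    the next frame has not yet arrived, playback pauses until it does.
    Returns total paused time divided by total playback session time.
    (Illustrative model; the thesis's exact metric may differ.)"""
    arrival_times = sorted(arrival_times)
    clock = arrival_times[startup_frames - 1]   # playback start time
    start = clock
    paused = 0.0
    for t in arrival_times[startup_frames:]:
        clock += frame_duration                 # play one buffered frame
        if t > clock:                           # buffer ran dry: stall
            paused += t - clock
            clock = t
    end = clock + frame_duration                # play the last frame out
    return paused / (end - start)
```

Fed with packet traces from real TCP connections, a trace with smooth arrivals yields an intensity of zero, while a trace with a retransmission-induced gap yields a positive value proportional to the stall, which is exactly the continuity property the metric is meant to capture.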
Abstract:
Strontium has been substituted for calcium in the glass series (SiO2)49.46(Na2O)26.38(P2O5)1.07(CaO)23.08−x(SrO)x (where x = 0, 11.54, 23.08) to elucidate their underlying atomic-scale structural characteristics as a basis for understanding features related to bioactivity. These bioactive glasses have been investigated using isomorphic neutron and X-ray diffraction, Sr K-edge EXAFS and solid-state 17O, 23Na, 29Si, 31P and 43Ca magic-angle-spinning (MAS) NMR. An effective isomorphic-substitution first-order difference function has been applied to the neutron diffraction data, confirming that Ca and Sr behave in a similar manner within the glass network, with residual differences attributed solely to the variation in ionic radius between the two species. The diffraction data provide the first direct experimental evidence of split Ca–O nearest-neighbour correlations in these melt-quench bioactive glasses, together with an analogous splitting of the Sr–O correlations; the correlations are attributed to the metal ions being correlated either to bridging or to non-bridging oxygen atoms. Triple-quantum (3Q) 43Ca MAS NMR corroborates the split Ca–O correlations. Successful simplification of the 2 < r (Å) < 3 region via the difference method has also revealed two distinct Na environments. These environments are attributed to sodium correlated either to bridging or to non-bridging oxygen atoms. Complementary multinuclear MAS NMR, Sr K-edge EXAFS and X-ray diffraction data support the structural model presented. The structural sites present will be intimately related to release properties in physiological fluids such as plasma and saliva, and hence to the bioactivity of the material. Detailed structural knowledge is therefore a prerequisite for optimising material design.
Abstract:
Agile methodologies are becoming more popular in the software development process nowadays. The iterative development lifecycle, openness to frequent changes, and tight cooperation with the client and among the software engineers are becoming ever more effective practices and respond better to current business needs. It is natural to ask which methodology is the most suitable to use when starting and managing a project. This depends on many factors: product characteristics, technologies used, client's and developer's experience, and project type. A systematic analysis of the most common problems appearing in the development of one particular type of project, public portal solutions, is proposed. In the case at hand a very close interaction with various types of end users is observed. This is a prerequisite for permanent changes during the development and support cycles, which makes such projects ideal candidates for using an agile methodology. We compare the ways in which each methodology addresses the specific problems arising and finish by ranking the methodologies according to their relevance. This might help the project manager in choosing one of the methodologies or a combination of them.
Abstract:
2000 Mathematics Subject Classification: 60J80, 60J85, 62P10, 92D25.
Abstract:
Higher and further education institutions are increasingly using social software tools to support teaching and learning. A growing body of research investigates the diversity of tools and their range of contributions. However, little research has focused on investigating the role of the educator in the context of a social software initiative, even though the educator is critical for the introduction and successful use of social software in a course environment. Hence, we argue that research on social software should place greater emphasis on the educators, as their roles and activities (such as selecting the tools, developing the tasks and facilitating the student interactions on these tools) are instrumental to most aspects of a social software initiative. To this end, we have developed an agenda for future research on the role of the educator. Drawing on role theory, both as the basis for a systematic conceptualization of the educator role and as a guiding framework, we have developed a series of concrete research questions that address core issues associated with the educator roles in a social software context and provide recommendations for further investigations. By developing a research agenda we hope to stimulate research that creates a better understanding of the educator’s situation and develops guidelines to help educators carry out their social software initiatives. Considering the significant role an educator plays in the initiation and conduct of a social software initiative, our research agenda ultimately seeks to contribute to the adoption and efficient use of social software in the educational domain.
Abstract:
Software architecture plays an essential role in the high-level description of a system design, where structure and communication are emphasized. Despite its importance in the software engineering process, the lack of formal description and automated verification hinders the development of good software architecture models. In this paper, we present an approach to support the rigorous design and verification of software architecture models using semantic web technology. We view software architecture models as ontology representations, where their structures and communication constraints are captured by the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). Specific design configurations are represented as concrete instances of the ontology, whose structures and dynamic behaviors must conform to these constraints. Furthermore, ontology reasoning tools can be applied to perform various automated verification tasks on the design to ensure correctness, such as consistency checking, style recognition, and behavioral inference.
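The paper's consistency checks are performed by an OWL/SWRL reasoner; as a language-neutral illustration of the kind of structural constraint such a reasoner enforces, here is a hypothetical plain-Python check over a toy architecture model (component and connector names are invented, and this is not the paper's ontology encoding):

```python
def check_connectors(components, connectors):
    """Flag connectors whose endpoints reference undeclared components,
    and components that no connector touches -- the kind of structural
    consistency condition a description-logic reasoner can detect."""
    errors = []
    for name, (src, dst) in connectors.items():
        for end in (src, dst):
            if end not in components:
                errors.append(f"connector {name}: unknown component {end}")
    wired = {end for ends in connectors.values() for end in ends}
    for comp in sorted(components):
        if comp not in wired:
            errors.append(f"component {comp} is unconnected")
    return errors

components = {"Client", "Server", "Logger"}
connectors = {"rpc": ("Client", "Server"), "log": ("Server", "Cache")}
errors = check_connectors(components, connectors)
# -> ['connector log: unknown component Cache', 'component Logger is unconnected']
```

In the ontology-based approach, such conditions are stated once as class axioms and SWRL rules, and every concrete configuration instance is checked against them automatically rather than by hand-written code like this.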
Abstract:
Modern software systems are often large and complicated. To better understand, develop, and manage large software systems, researchers have studied software architectures, which provide the top-level overall structural design of software systems, for the last decade. One major research focus within software architecture is formal architecture description languages; however, most existing research focuses primarily on descriptive capability and puts less emphasis on software architecture design methods and formal analysis techniques, which are necessary to develop correct software architecture designs. Refinement is a general approach of adding details to a software design. A formal refinement method can further ensure certain design properties. This dissertation proposes refinement methods, including a set of formal refinement patterns and complementary verification techniques, for software architecture design using the Software Architecture Model (SAM), which was developed at Florida International University. First, a general guideline for software architecture design in SAM is proposed. Second, specification construction through property-preserving refinement patterns is discussed. The refinement patterns are categorized into connector refinement, component refinement, and high-level Petri net refinement. These three levels of refinement patterns are applicable to overall system interaction, architectural components, and the underlying formal language, respectively. Third, verification after modeling as a complementary technique to specification refinement is discussed. Two formal verification tools, the Stanford Temporal Prover (STeP) and the Simple Promela Interpreter (SPIN), are adopted into SAM to develop the initial models. Fourth, the formalization and refinement of security issues are studied. A method for security enforcement in SAM is proposed. The Role-Based Access Control model is formalized using predicate transition nets and Z notation. Patterns for enforcing access control and auditing are proposed. Finally, the modeling and refinement of a life insurance system is used to demonstrate how to apply the refinement patterns for software architecture design using SAM and how to integrate the access control model. The results of this dissertation demonstrate that a refinement method is an effective way to develop a high-assurance system. The method developed in this dissertation extends existing work on modeling software architectures using SAM and makes SAM a more usable and valuable formal tool for software architecture design.
Abstract:
Current technology permits connecting local networks via high-bandwidth telephone lines. Central coordinator nodes may use Intelligent Networks to manage data flow over dialed data lines, e.g. ISDN, and to establish connections between LANs. This dissertation focuses on cost minimization and on establishing operational policies for query distribution over heterogeneous, geographically distributed databases. Based on our study of query distribution strategies, public network tariff policies, and database interface standards we propose methods for communication cost estimation, strategies for the reduction of bandwidth allocation, and guidelines for central to node communication protocols. Our conclusion is that dialed data lines offer a cost effective alternative for the implementation of distributed database query systems, and that existing commercial software may be adapted to support query processing in heterogeneous distributed database systems.
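The communication cost estimation mentioned above comes down to tariff arithmetic over call setup and connect time. A minimal sketch, with entirely hypothetical tariff figures (the dissertation's actual cost models and tariff data are not given in the abstract):

```python
import math

def isdn_transfer_cost(megabytes, channels=1, setup_fee=0.10,
                       per_minute=0.05, kbit_per_s_per_channel=64):
    """Estimate the cost of shipping a query result over a dialed ISDN
    connection: a call-setup fee plus a per-minute tariff, with the
    duration billed in whole started minutes. One B-channel carries
    64 kbit/s; all monetary figures here are hypothetical."""
    bits = megabytes * 8e6
    seconds = bits / (channels * kbit_per_s_per_channel * 1000)
    minutes = math.ceil(seconds / 60)
    return setup_fee + minutes * per_minute

# Bonding a second B-channel halves the transfer time, but whole-minute
# billing means the cost does not scale down exactly proportionally:
one = isdn_transfer_cost(10, channels=1)
two = isdn_transfer_cost(10, channels=2)
```

Estimates of this shape let a central coordinator compare dialed-line cost against leased-line alternatives per query, which is the operational-policy question the dissertation addresses.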
Abstract:
Selenium is known to occur in the enzyme glutathione peroxidase and plays an important role as an antioxidant. The objective of this investigation was to determine whether selenium is selectively accumulated in different regions of the retina or uniformly distributed with eccentricity. Twenty human retinas were analyzed for selenium. Eighteen of these were sectioned into a disc and two concentric annuli centered on the fovea using trephines with diameters of 3, 11, and 21 mm. The sections had areas of 7.1, 93, and 343 mm², respectively. Corresponding sections of these retinas were combined and analyzed together in sets of n = 5 and n = 11. For two donors, the whole retina of one eye was analyzed for selenium and the other retina was sectioned for analysis as described above. Selenium was determined using atomic fluorescence spectroscopy after digestion of the retinal tissues in nitric acid. The two whole retinas were found to have an average of 0.89 ± 0.49 pmoles/mm² of selenium, as compared to 0.84 ± 0.28 pmoles/mm² for the companion eyes as determined from the sum of the selenium amounts measured in the individual sections. The inner, medial, and outer portions of these two sectioned retinas were found to contain averages of 5.28 ± 1.1, 1.28 ± 0.44, and 0.63 ± 0.22 pmoles/mm², respectively. The five retinas that were sectioned and pooled for analysis were found to have average amounts of 3.64, 1.26, and 0.56 pmoles/mm². The 11 pooled retinas were found to have 1.16, 0.61, and 0.38 pmoles/mm², respectively, in the same three sections. This limited data set indicates that selenium is not uniformly distributed within the human retina but rather is concentrated to a greater extent within the macula. If confirmed, these data would support the hypothesis that selenium may be an important antioxidant involved in protection of the macula from radical oxidants.
Abstract:
Sediment dynamics on a storm-dominated shelf (western Bay of Plenty, New Zealand) were mapped and analyzed using the newly developed multi-sensor benthic profiler MARUM NERIDIS III. An area of 60 km × 7 km between 2 and 35 m water depth was surveyed with this bottom-towed sled equipped with a high-resolution camera for continuous close-up seafloor photography and a CTD with connected turbidity sensor. Here we introduce our approach of using this multi-parameter dataset combined with sidescan sonography and sedimentological analyses to create detailed lithofacies and bedform distribution maps and to derive regional sediment transport patterns. For the assessment of sediment distribution, photographs were classified and their spatial distribution mapped out according to associated acoustic backscatter from a sidescan sonar. This provisional map was used to choose target locations for surficial sediment sampling and subsequent laboratory analysis of grain size distribution and mineralogical composition. Finally, photographic, granulometric and mineralogical facies were combined into a unified lithofacies map and corresponding stratigraphic model. Eight distinct types of lithofacies with seawards increasing grain size were discriminated and interpreted as reworked relict deposits overlain by post-transgressional fluvial sediments. The dominant transport processes in different water depths were identified based on the type and orientation of bedforms, as well as bottom water turbidity and lithofacies distribution. Observed bedforms include subaquatic dunes, coarse sand ribbons and sorted bedforms of varying dimensions, which were interpreted as being initially formed by erosion. Under fair weather conditions, sediment is transported from the northwest towards the southeast by littoral drift. During storm events, a current from the southeast to the northwest is induced, which transports sediment along the shore in up to 35 m water depth.
Shoreward-oriented cross-shore transport takes place in up to 60 m water depth and is likewise initiated by storm events. Our study demonstrates how benthic photographic profiling delivers comprehensive compositional, structural and environmental information, which compares well with results obtained by traditional probing methods but offers much higher spatial resolution while covering larger areas. Multi-sensor benthic profiling enhances the interpretability of acoustic seafloor mapping techniques and is a rapid and economical approach to seabed and habitat mapping, especially in muddy to sandy facies.