928 results for Tablet computers
Abstract:
A variety of applications exist for reverse saturable absorbers (RSAs) in the area of optical pulse processing and computing. An RSA can be used as a power limiter/pulse smoother and as an energy limiter/pulse shortener of laser pulses. A combination of an RSA and a saturable absorber (SA) can be used for mode locking and pulse shaping between high-power laser amplifiers in an oscillator-amplifier chain. An RSA can also be used to construct a molecular spatial light modulator (SLM), which acts as an input/output device in optical computers. A detailed review of the theoretical studies of these processes is presented. Current efforts to find RSAs at desired wavelengths for testing these theoretical predictions are also discussed.
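The limiting action described above can be sketched numerically with a simple two-level model in which the effective absorption grows with intensity; all parameter values below are illustrative assumptions, not taken from the review:

```python
def rsa_transmit(I_in, length=1.0, alpha0=1.0, Is=1.0, k=5.0, steps=1000):
    """Propagate intensity through an RSA slab with a simple two-level
    intensity-dependent absorption model (illustrative parameters)."""
    dz = length / steps
    I = I_in
    for _ in range(steps):
        # Effective absorption rises from alpha0 toward k*alpha0 as I >> Is,
        # the opposite of a saturable absorber (hence "reverse" saturable):
        alpha = alpha0 * (1.0 + k * I / Is) / (1.0 + I / Is)
        I -= alpha * I * dz
    return I

# Transmission T = I_out / I_in falls as the input intensity rises,
# which is the power-limiting / pulse-smoothing behaviour an RSA exploits.
T_low = rsa_transmit(0.01) / 0.01
T_high = rsa_transmit(10.0) / 10.0
```

Because the high-intensity parts of a pulse see stronger absorption than the wings, a temporal pulse passed through such a slab is flattened on top, which is the smoothing effect mentioned in the abstract.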
Abstract:
Use of natural xanthine derivatives in medicine is complicated by their physical properties: theobromine is poorly soluble, while theophylline is highly sensitive to hydration. The aim of this study was to improve the bioavailability of xanthines by co-crystallization; theophylline was also co-crystallized with carboxylic acids (capric, citric, glutaric, maleic, malonic, oxalic, stearic, succinic) and with HPMC. Co-crystallization was performed by slow evaporation and ball milling. Physical stability was checked by wet granulation and water sorption methods, and solubility was measured by intrinsic tablet dissolution. Theobromine formed co-crystals with the other xanthines, and theophylline interacted with all acids except stearic, and with HPMC; the latter showed alternative interactions based on hydrogen bonding. Hydration resistance was good in the theophylline:succinic acid co-crystal and excellent in complexes containing capric and stearic acids and HPMC. Theophylline:HPMC showed improved solubility. The reported approach can promote the use of xanthines and can be recommended for other compounds with similar problems.
Abstract:
Instability of laminated curved composite beams made of repeated sublaminate construction is studied using the finite element method. In repeated sublaminate construction, a full laminate is obtained by repeating a basic sublaminate which has a smaller number of plies. This paper deals with the determination of the optimum lay-up for buckling by ranking of such composite curved beams (which may be solid or sandwich). For this purpose, use is made of a two-noded, 16-degrees-of-freedom curved composite beam finite element. The displacements u, v, w of the element reference axis are expressed in terms of one-dimensional first-order Hermite interpolation polynomials, and line member assumptions are invoked in the formulation of the elastic stiffness matrix and the geometric stiffness matrix. The nonlinear expressions for the strains occurring in beams subjected to axial, flexural and torsional loads are incorporated in a general instability analysis. The computer program developed has been used, after extensive checking for correctness, to obtain the optimum orientation scheme of the plies in the sublaminate so as to achieve the maximum buckling load for typical curved solid/sandwich composite beams.
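The ranking idea can be sketched as a brute-force search over candidate ply angles. The `buckling_score` function below is a hypothetical surrogate written for illustration only; in the paper this number comes from the 16-DOF curved-beam finite element analysis:

```python
import math
from itertools import product

def cos_sq(angle_deg):
    return math.cos(math.radians(angle_deg)) ** 2

def buckling_score(layup):
    # Hypothetical surrogate: reward stiff (near-0 degree) plies, weighted
    # toward the outer positions. A real ranking would evaluate the FE
    # buckling load of each candidate sublaminate instead.
    return sum((len(layup) - i) * cos_sq(a) for i, a in enumerate(layup))

angles = (0, 45, 90)                          # candidate ply orientations
candidates = list(product(angles, repeat=3))  # all 3-ply sublaminates
ranked = sorted(candidates, key=buckling_score, reverse=True)
best = ranked[0]
```

Because a sublaminate has only a few plies, exhaustive enumeration of orientation schemes is feasible even when each candidate requires a full buckling analysis.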
Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by a French scientist, P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. They were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly; this was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea was that the sample would be a miniature of the population; it is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics.
In addition, he introduced a new statistical inference model which is still the prevailing paradigm. The essential idea is to draw samples repeatedly from the same population, with the assumption that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave the central idea for statisticians at the U.S. Census Bureau to develop the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
Abstract:
The free vibration characteristics of a beam-column having randomly varying Young's modulus and mass density and subjected to randomly distributed axial loading are analysed. The material property fluctuations and axial loading are considered to constitute independent, one-dimensional, univariate, homogeneous, real, spatially distributed stochastic fields. Hamilton's principle is used to formulate the problem using the stochastic FEM. Vibration frequencies and mode shapes are analysed for their statistical descriptions. A numerical example is shown.
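A minimal Monte Carlo sketch of the idea, using the closed-form fundamental frequency of a simply supported beam-column rather than the paper's stochastic FEM; the section properties, mean values, and coefficients of variation below are illustrative assumptions:

```python
import math
import random

random.seed(0)

L, I_sec, A = 2.0, 1e-8, 1e-4        # length (m), second moment (m^4), area (m^2)
E0, rho0, P0 = 70e9, 2700.0, 100.0   # mean Young's modulus, density, axial load
cov = 0.05                            # assumed coefficient of variation

def first_frequency(E, rho, P):
    """Fundamental frequency (rad/s) of a simply supported beam-column."""
    P_cr = math.pi**2 * E * I_sec / L**2               # Euler buckling load
    omega0 = (math.pi / L)**2 * math.sqrt(E * I_sec / (rho * A))
    return omega0 * math.sqrt(1.0 - P / P_cr)          # axial load softens the beam

samples = []
for _ in range(5000):
    E = random.gauss(E0, cov * E0)
    rho = random.gauss(rho0, cov * rho0)
    P = random.gauss(P0, cov * P0)
    samples.append(first_frequency(E, rho, P))

mean = sum(samples) / len(samples)
std = math.sqrt(sum((w - mean)**2 for w in samples) / len(samples))
```

The stochastic FEM replaces this crude sampling by discretizing the random fields over the elements, but the output is of the same kind: statistical descriptions (mean, standard deviation) of the frequencies.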
Abstract:
The aim of this study was to investigate powder and tablet behavior at the level of mechanical interactions between single particles. Various aspects of powder packing, mixing, compression, and bond formation were examined with the aid of computer simulations. The packing and mixing simulations were based on spring forces interacting between particles. Packing and breakage simulations included systems in which permanent bonds were formed and broken between particles, based on their interaction strengths. During the process, a new simulation environment based on Newtonian mechanics and elementary interactions between the particles was created, and a new method for evaluating mixing was developed. Powder behavior is complicated, and many of its aspects are still unclear. Powders as a whole exhibit some aspects of solids and others of liquids; therefore, their physics is far from clear. However, using relatively simple models based on particle-particle interaction, many powder properties could be replicated during this work. Simulated packing densities were similar to values reported in the literature. The method developed for describing powder mixing correlated well with previous methods. The new method can be applied to determine mixing in completely homogeneous materials, without dividing them into different components. As such, it can describe the efficiency of the mixing method, regardless of the powder's initial setup. The mixing efficiency under different vibration conditions was examined, and we found that certain combinations of amplitude, direction, and frequency resulted in better mixing while using less energy. Simulations using exponential force potentials between particles were able to explain the elementary compression behavior of tablets, and create force distributions that were similar to the pressure distributions reported in the literature.
Tablet-breaking simulations resulted in breaking strengths that were similar to measured tablet breaking strengths. In general, many aspects of powder behavior can be explained with mechanical interactions at the particle level, and single particle properties can be reliably linked to powder behavior with accurate simulations.
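The elementary particle-particle interactions underlying such simulations can be sketched with a linear contact spring and semi-implicit Euler integration of Newton's equations; masses, stiffness, and time step below are illustrative assumptions:

```python
# Two equal-mass discs on a line, approaching head-on.
x = [0.0, 1.0]          # positions
v = [1.0, -1.0]         # velocities
m, r, k, dt = 1.0, 0.3, 500.0, 1e-4   # mass, radius, contact stiffness, step

for _ in range(20000):  # 2 seconds of simulated time
    overlap = 2 * r - (x[1] - x[0])
    f = k * overlap if overlap > 0 else 0.0   # linear repulsive contact spring
    a = [-f / m, f / m]                        # equal and opposite forces
    for i in range(2):
        v[i] += a[i] * dt    # semi-implicit Euler: velocity first,
        x[i] += v[i] * dt    # then position with the updated velocity

momentum = m * v[0] + m * v[1]
```

With no damping in the contact law the collision is elastic (the particles rebound at their approach speed) and momentum is conserved exactly; adding dissipative and bonding terms to the force gives the permanent bond formation and breakage the abstract describes.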
Abstract:
This work is a case study of applying nonparametric statistical methods to corpus data. We show how to use ideas from permutation testing to answer linguistic questions related to morphological productivity and type richness. In particular, we study the use of the suffixes -ity and -ness in the 17th-century part of the Corpus of Early English Correspondence within the framework of historical sociolinguistics. Our hypothesis is that the productivity of -ity, as measured by type counts, is significantly low in letters written by women. To test such hypotheses, and to facilitate exploratory data analysis, we take the approach of computing accumulation curves for types and hapax legomena. We have developed an open source computer program which uses Monte Carlo sampling to compute the upper and lower bounds of these curves for one or more levels of statistical significance. By comparing the type accumulation from women’s letters with the bounds, we are able to confirm our hypothesis.
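The accumulation-curve computation can be sketched as follows; the toy corpus and the 95% coverage level are illustrative stand-ins for the -ity/-ness data from the Corpus of Early English Correspondence:

```python
import random

random.seed(42)

# Toy corpus of suffixed tokens (type repeated once per occurrence):
corpus = (["clarity"] * 5 + ["darkness"] * 5 + ["purity"] * 3 +
          ["kindness"] * 3 + ["vanity"] * 2 + ["sadness"] * 2)

def type_accumulation(tokens):
    """Number of distinct types seen after each successive token."""
    seen, curve = set(), []
    for t in tokens:
        seen.add(t)
        curve.append(len(seen))
    return curve

# Monte Carlo: permute the corpus many times, collect the curves, then take
# per-position bounds covering ~95% of the permutations.
runs = []
for _ in range(1000):
    perm = corpus[:]
    random.shuffle(perm)
    runs.append(type_accumulation(perm))

n = len(corpus)
lower = [sorted(r[i] for r in runs)[25] for i in range(n)]    # ~2.5th pct
upper = [sorted(r[i] for r in runs)[974] for i in range(n)]   # ~97.5th pct
```

A subcorpus (e.g. women's letters) whose observed accumulation curve falls below `lower` would then show significantly low type productivity at that significance level, which is the shape of the hypothesis test in the study.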
Abstract:
Fifty-six teachers, from four European countries, were interviewed to ascertain their attitudes to and beliefs about the Collaborative Learning Environments (CLEs) which were designed under the Innovative Technologies for Collaborative Learning Project. Their responses were analysed using categories based on a model from cultural-historical activity theory [Engestrom, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Helsinki: Orienta-Konsultit; Engestrom, Y., Engestrom, R., & Suntio, A. (2002). Can a school community learn to master its own future? An activity-theoretical study of expansive learning among middle school teachers. In G. Wells & G. Claxton (Eds.), Learning for life in the 21st century. Oxford: Blackwell Publishers]. The teachers were positive about CLEs and their possible role in initiating pedagogical innovation and enhancing personal professional development. This positive perception held across cultures and national boundaries. Teachers were aware of the fact that demanding planning was needed for successful implementations of CLEs. However, the specific strategies through which the teachers can guide students' inquiries in CLEs and the assessment of new competencies that may characterize student performance in the CLEs were poorly represented in the teachers' reflections on CLEs. The attitudes and beliefs of the teachers from separate countries had many similarities, but there were also some clear differences, which are discussed in the article. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
Seepage through sand bed channels in a downward direction (suction) reduces the stability of particles and initiates sand movement. Incipient motion in sand bed channels with seepage cannot be predicted using the conventional design approach. Metamodeling techniques, which employ a non-linear pattern analysis between input and output parameters and are based solely on experimental observations, can be used to model such phenomena. The traditional approach to finding non-dimensional parameters has not been used in the present work; instead, parameters which can influence incipient motion with seepage have been identified and non-dimensionalized. The non-dimensional stream power concept has been used to describe the process. Using these non-dimensional parameters, the present work describes a radial basis function (RBF) metamodel for prediction of the incipient motion condition affected by seepage. The coefficient of determination, R², of the model is 0.99; thus, the model predicts the phenomenon very well. With the help of the metamodel, design curves have been presented for designing sand bed channels affected by seepage. (C) 2010 Elsevier B.V. All rights reserved.
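A Gaussian RBF metamodel of the kind described can be sketched in a few lines. The training data below are a stand-in smooth response (a sine curve), not the seepage measurements, and the shape parameter `eps` is an illustrative choice:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, eps=1.0):
    """Fit a Gaussian RBF interpolant through the training points."""
    phi = lambda r: math.exp(-(eps * r) ** 2)
    A = [[phi(abs(xi - xj)) for xj in xs] for xi in xs]
    w = solve(A, ys)
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, xs))

# Train on samples of a smooth response, then predict between them:
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.sin(x) for x in xs]
model = rbf_fit(xs, ys)
```

The metamodel interpolates the training observations exactly and predicts smoothly in between, which is what allows design curves to be read off once the model is fitted to the non-dimensionalized experimental data.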
Abstract:
A key problem in helicopter aeroelastic analysis is the enormous computational time required for a numerical solution of the nonlinear system of algebraic equations required for trim, particularly when free wake models are used. Trim requires calculation of the main rotor and tail rotor controls and the vehicle attitude such that the six steady forces and moments about the helicopter center of gravity are zero. An appropriate initial estimate of the trim state is needed for successful helicopter trim. This study aims to determine the control inputs that can have a considerable effect on the convergence of the trim solution in the aeroelastic analysis of helicopter rotors by investigating the basin of attraction of the nonlinear equations (the set of initial guess points from which the nonlinear equations converge). It is illustrated that the three main rotor pitch controls of collective pitch, longitudinal cyclic pitch and lateral cyclic pitch have a significant contribution to the convergence of the trim solution. Trajectories of the Newton iterates are shown and some ideas for accelerating the convergence of a trim solution in the aeroelastic analysis of helicopters are proposed. It is found that the basins of attraction can have fractal boundaries. (C) 2010 Elsevier Ltd. All rights reserved.
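The basin-of-attraction idea can be sketched with a scalar stand-in for the trim equations: Newton's method applied to a function with several roots, where the converged solution depends sensitively on the initial guess:

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Newton iteration; returns (root, converged) from initial guess x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x, True
        d = df(x)
        if d == 0:
            return x, False
        x -= fx / d
    return x, False

# f has three roots (-1, 0, 1); which one Newton finds depends on the start
# point, just as the converged trim state depends on the initial control guess.
f = lambda x: x**3 - x
df = lambda x: 3 * x**2 - 1

basins = {}
for i in range(-20, 21):
    x0 = i / 10.0
    root, ok = newton(f, df, x0)
    if ok:
        basins[x0] = round(root)
```

Note that a start at x0 = 0.5, between the roots at 0 and 1, converges to the root at -1: near basin boundaries the mapping from initial guess to converged solution becomes highly irregular, which is the fractal-boundary phenomenon reported in the abstract.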
Abstract:
A model comprising several servers, each equipped with its own queue and with possibly different service speeds, is considered. Each server receives a dedicated arrival stream of jobs; there is also a stream of generic jobs that arrive to a job scheduler and can be individually allocated to any of the servers. It is shown that if the arrival streams are all Poisson and all jobs have the same exponentially distributed service requirements, the probabilistic splitting of the generic stream that minimizes the average job response time is such that it balances the server idle times in a weighted least-squares sense, where the weighting coefficients are related to the service speeds of the servers. The corresponding result holds for non-exponentially distributed service times if the service speeds are all equal. This result is used to develop adaptive quasi-static algorithms for allocating jobs in the generic arrival stream when the load parameters are unknown. The algorithms utilize server idle-time measurements which are sent periodically to the central job scheduler. A model is developed for these measurements, and the result mentioned is used to cast the problem into one of finding a projection of the root of an affine function, when only noisy values of the function can be observed.
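The optimal probabilistic splitting can be sketched for two M/M/1 servers by a direct search over the splitting probability; the rates below are illustrative, and the known-parameter search stands in for the adaptive algorithms, which must estimate the loads from idle-time measurements:

```python
def mean_response(p, mu=(2.0, 2.0), lam=(0.5, 0.5), G=1.0):
    """Average job response time for two M/M/1 queues sharing a generic
    Poisson stream of rate G: a fraction p is routed to server 1."""
    a1, a2 = lam[0] + p * G, lam[1] + (1 - p) * G
    if a1 >= mu[0] or a2 >= mu[1]:
        return float("inf")                # an overloaded queue is unstable
    # Mean number in system of an M/M/1 queue is rho / (1 - rho):
    jobs_in_system = a1 / (mu[0] - a1) + a2 / (mu[1] - a2)
    return jobs_in_system / (a1 + a2)      # Little's law: T = L / lambda

# Grid search for the splitting probability minimizing mean response time:
grid = [i / 1000 for i in range(1001)]
p_opt = min(grid, key=mean_response)
```

For the symmetric rates chosen here the optimum splits the generic stream evenly, equalizing the two idle probabilities 1 - rho_i; with unequal service speeds the optimum instead balances the idle times in the weighted least-squares sense stated in the abstract.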
Abstract:
Static and vibration problems of an indeterminate continuum are traditionally analyzed by the stiffness method. The force method is more or less non-existent for such problems. This situation is primarily due to the incomplete state of development of the compatibility conditions, which are essential for the analysis of indeterminate structures by the flexibility method. The understanding of the Compatibility Conditions (CC) has been substantially augmented. Based on this understanding, a novel formulation termed the Integrated Force Method (IFM) has been established. In this paper, IFM is extended to the static and vibration analyses of a continuum. The IFM analysis is illustrated with three examples: (1) a rectangular plate in flexure; (2) analysis of a cantilevered dam; (3) free vibration analysis of a beam. From the examples solved, it is observed that the force response of an indeterminate continuum with mixed boundary conditions can be generated by IFM without any reference to displacements in the field or on the boundary. Displacements, if required, can be calculated by back substitution.
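The force-first flavour of the method can be sketched on a deliberately simple indeterminate system of our own choosing (three parallel springs, not an example from the paper): one equilibrium row and two compatibility rows are assembled into a single system solved directly for the member forces, with the displacement recovered afterwards by back substitution:

```python
def solve3(A, b):
    """Tiny Gaussian elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def parallel_spring_forces(k, P):
    """Member forces of three parallel springs under load P, solved for
    forces directly: equilibrium plus compatibility in one system."""
    k1, k2, k3 = k
    S = [[1.0, 1.0, 1.0],             # equilibrium: F1 + F2 + F3 = P
         [1.0 / k1, -1.0 / k2, 0.0],  # compatibility: F1/k1 = F2/k2
         [0.0, 1.0 / k2, -1.0 / k3]]  # compatibility: F2/k2 = F3/k3
    F = solve3(S, [P, 0.0, 0.0])
    delta = F[0] / k1                 # displacement by back substitution
    return F, delta

F, delta = parallel_spring_forces((1.0, 2.0, 3.0), 6.0)
```

The forces come out of the combined equilibrium-compatibility system with no displacement unknowns; the common elongation is obtained only afterwards, mirroring the force-first, displacements-by-back-substitution structure described in the abstract.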
Abstract:
This paper describes the architecture of a multiprocessor system, which we call the Broadcast Cube System (BCS), for solving important computation-intensive problems such as systems of linear algebraic equations and Partial Differential Equations (PDEs), and highlights its features. Further, this paper presents an analytical performance study of the BCS and describes the main details of the design and implementation of a simulator for the BCS.