833 results for: Using mobile phones for development
Abstract:
Cross-platform development frameworks for mobile applications promise significant advantages in cost reduction and ease of maintenance, making them an attractive option for organizations interested in designing mobile applications for several platforms. Given that platform conventions are especially important for the User eXperience (UX) of mobile applications, using a framework in which the same code defines the behavior of the app on different platforms could have a negative impact on the UX. This paper describes a study in which two independent teams designed two different versions of a mobile application: one team used a framework that generates the Android and iOS versions automatically, while the other used native tools. The alternative versions for each platform were evaluated with 37 users through a combination of a laboratory usability test and a longitudinal study. The results show that the differences are minimal on the Android platform; on iOS, however, even though a reasonably good UX can be obtained with this framework by a UX-conscious design team, a higher level of UX can be achieved by developing directly with native tools.
Abstract:
In this project, we propose the implementation of a 3D object recognition system optimized to operate under demanding time constraints. The system must be robust, so that objects can be recognized properly under poor lighting conditions and in cluttered scenes with significant levels of occlusion. An important requirement must be met: the system must exhibit reasonable performance running on a low-power-consumption mobile GPU computing platform (NVIDIA Jetson TK1), so that it can be integrated into mobile robotics systems, ambient intelligence, or ambient assisted living applications. The acquisition system is based on the color and depth (RGB-D) data streams provided by low-cost 3D sensors such as the Microsoft Kinect or PrimeSense Carmine. The range of algorithms and applications to be implemented and integrated is quite broad, ranging from the acquisition, outlier removal, and filtering of the input data, through the segmentation and characterization of regions of interest in the scene, to object recognition and pose estimation themselves. Furthermore, in order to validate the proposed system, we will create a 3D object dataset. It will be composed of a set of 3D models, reconstructed from common household objects, as well as a handful of test scenes in which those objects appear. The scenes will be characterized by different levels of occlusion, diverse distances from the elements to the sensor, and variations in the pose of the target objects. The creation of this dataset implies the additional development of 3D data acquisition and 3D object reconstruction applications. The resulting system has many possible applications, ranging from mobile robot navigation and semantic scene labeling to human-computer interaction (HCI) systems based on visual information.
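As an illustration of the processing chain sketched above (acquisition, outlier removal, filtering, and pose estimation), the following minimal Python sketch uses the Open3D library, which is not named in the abstract; file names and parameter values are illustrative assumptions.

```python
# Sketch of an RGB-D recognition pipeline stage chain: acquisition ->
# outlier removal -> downsampling -> ICP-based pose refinement.
# Uses Open3D (not mentioned in the original work); parameters are illustrative.
import numpy as np
import open3d as o3d

def preprocess(pcd, voxel=0.005):
    # Statistical outlier removal: drop points far from their neighbours.
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    # Voxel downsampling keeps the pipeline fast enough for an embedded GPU.
    return pcd.voxel_down_sample(voxel_size=voxel)

def estimate_pose(model, scene, init=np.eye(4)):
    # Refine the model-to-scene pose with point-to-point ICP.
    result = o3d.pipelines.registration.registration_icp(
        model, scene, 0.01, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation, result.fitness

scene = preprocess(o3d.io.read_point_cloud("scene.pcd"))      # hypothetical file
model = preprocess(o3d.io.read_point_cloud("mug_model.pcd"))  # hypothetical file
pose, fitness = estimate_pose(model, scene)
print("estimated pose:\n", pose, "\ninlier fitness:", fitness)
```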
Abstract:
This dissertation studies context-aware applications and the algorithms proposed for the client side. The required context-aware infrastructure is discussed in depth to illustrate that such an infrastructure collects the mobile user's context information, registers service providers, derives the mobile user's current context, distributes user context among context-aware applications, and provides tailored services. The proposed approach tries to strike a balance between the context server and the mobile devices: context acquisition is centralized at the server to ensure the reusability of context information among mobile devices, while context reasoning remains at the application level. Hence, centralized context acquisition combined with distributed context reasoning is viewed as the better solution overall. A context-aware search application is designed and implemented on the server side, and a new algorithm is proposed that takes the user's context profiles into consideration. By promoting feedback on the dynamics of the system, any prior user selection is saved for further analysis, so that it may contribute to the results of a subsequent search. On the basis of these server-side developments, various solutions are provided at the client side. A software proxy component is set up for the purpose of data collection. This research endorses the belief that the client-side proxy should contain the context reasoning component; the implementation of such a component lends credence to this belief in that context applications are able to derive the user's context profiles. Furthermore, a context cache scheme is implemented to manage the cache on the client device in order to minimize processing requirements and other resources (bandwidth, CPU cycles, power). Java and MySQL platforms are used to implement the proposed architecture and to test scenarios derived from a user's daily activities. To meet the practical demands of a testing environment without the heavy cost of establishing a comprehensive infrastructure, a software simulation using the free Yahoo search API is provided as a means to evaluate the effectiveness of the design approach in a realistic way. The integration of the Yahoo search engine into the context-aware architecture shows how a context-aware application can meet user demands for tailored services and products in and around the user's environment. The test results show that the overall design is highly effective, providing new features and enriching the mobile user's experience through a broad scope of potential applications.
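The following is a minimal sketch of the kind of client-side context cache described above; the TTL-based eviction policy, key names, and fetch callback are assumptions for illustration, not details taken from the dissertation.

```python
# Minimal sketch of a client-side context cache that trades freshness for
# bandwidth/CPU/power, in the spirit of the scheme described above.
# Keys, TTLs, and the fetch function are hypothetical.
import time

class ContextCache:
    def __init__(self, ttl_seconds=60.0, max_entries=128):
        self.ttl = ttl_seconds
        self.max_entries = max_entries
        self._store = {}  # key -> (expiry_time, value)

    def get(self, key, fetch):
        """Return cached context if still fresh; otherwise fetch from the server."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry and entry[0] > now:
            return entry[1]                       # cache hit: no network round-trip
        value = fetch(key)                        # cache miss: query the context server
        if len(self._store) >= self.max_entries:  # crude eviction: drop the stalest entry
            self._store.pop(min(self._store, key=lambda k: self._store[k][0]))
        self._store[key] = (now + self.ttl, value)
        return value

cache = ContextCache(ttl_seconds=30)
profile = cache.get("user:location", lambda k: {"place": "office"})  # hypothetical fetch
```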
Abstract:
This paper outlines the development of a cross-correlation algorithm and a spiking neural network (SNN) for sound localisation based on real sound recorded in a noisy and dynamic environment by a mobile robot. The SNN architecture aims to simulate the sound localisation ability of the mammalian auditory pathways by exploiting the binaural cue of interaural time difference (ITD). The medial superior olive was the inspiration for the SNN architecture, which required the integration of an encoding layer that produces biologically realistic spike trains, a model of the bushy cells found in the cochlear nucleus, and a supervised learning algorithm. The experimental results demonstrate that biologically inspired sound localisation achieved using an SNN compares favourably with the more classical technique of cross-correlation.
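For reference, the classical cross-correlation baseline mentioned above can be sketched in a few lines: find the lag that maximises the correlation between the two ear signals and convert it to an azimuth under a far-field approximation. The sample rate and microphone spacing below are illustrative values, not those of the robot in the paper.

```python
# Sketch of ITD-based sound localisation by cross-correlation: the lag of the
# correlation peak gives the interaural time difference, which maps to an
# azimuth via sin(theta) = ITD * c / d in the far field.
import numpy as np

def itd_azimuth(left, right, fs=44100, mic_distance=0.15, c=343.0):
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)    # lag in samples; sign gives the side
    itd = lag / fs                              # interaural time difference, seconds
    return np.degrees(np.arcsin(np.clip(itd * c / mic_distance, -1.0, 1.0)))

fs = 44100
t = np.arange(0, 0.05, 1 / fs)
sig = np.sin(2 * np.pi * 500 * t)
delayed = np.roll(sig, 10)                      # simulate a source off to one side
print("estimated azimuth (deg):", itd_azimuth(sig, delayed, fs))
```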
Abstract:
Recent advances in mobile phone cameras have positioned them to overtake compact hand-held cameras as the consumer's preferred option. Along with the growth in pixel counts, motion blur removal, face tracking, and noise reduction algorithms play significant roles in the internal processing of these devices. An undesired effect of severe noise reduction is the loss of texture (i.e., low-contrast fine details) of the original scene. Currently established methods for resolution measurement fail to accurately portray the texture loss incurred in a camera system, so the development of an accurate objective method to quantify the texture preservation or texture reproduction capability of a camera device is important. The 'Dead Leaves' target has been used extensively to measure the modulation transfer function (MTF) of cameras that employ highly non-linear noise-reduction methods. This stochastic model consists of a series of overlapping circles with radii r distributed as r^-3 and uniformly distributed gray levels, which gives an accurate model of occlusion in a natural setting and hence mimics a natural scene. The target can thus be used to model texture transfer through a camera system when a natural scene is captured. In the first part of our study, we identify various factors that affect the MTF measured using the Dead Leaves chart, including variations in illumination, distance, exposure time, and ISO sensitivity, among others. We discuss the main differences between this method and existing resolution measurement techniques and identify its advantages. In the second part of this study, we propose an improvement to the current texture MTF measurement algorithm. High-frequency residual noise in the processed image contains the same frequency content as fine texture detail and is sometimes reported as such, leading to inaccurate results. A wavelet-thresholding-based denoising technique is used to model the noise present in the final captured image, and this updated noise model is then used to calculate an accurate texture MTF. We present comparative results for both algorithms under various image capture conditions.
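A minimal sketch of the Dead Leaves construction described above: radii are drawn from the r^-3 law by inverse-CDF sampling, and disks with uniform gray levels occlude one another. Image size, disk count, and radius bounds are illustrative choices, not the chart's specification.

```python
# Sketch of a 'Dead Leaves' target generator: circle radii follow an r^-3
# power law (sampled by inverting the CDF) and gray levels are uniform,
# mimicking the occlusion statistics of natural scenes.
import numpy as np

def dead_leaves(size=512, n_disks=4000, r_min=2.0, r_max=100.0, seed=0):
    rng = np.random.default_rng(seed)
    img = np.full((size, size), 0.5)
    yy, xx = np.mgrid[0:size, 0:size]
    # Inverse-CDF sampling for pdf(r) ~ r^-3 on [r_min, r_max]:
    # r = (r_min^-2 - u * (r_min^-2 - r_max^-2))^(-1/2)
    u = rng.random(n_disks)
    r = 1.0 / np.sqrt(r_min**-2 - u * (r_min**-2 - r_max**-2))
    cx, cy = rng.uniform(0, size, n_disks), rng.uniform(0, size, n_disks)
    g = rng.random(n_disks)                     # uniformly distributed gray levels
    for i in range(n_disks):                    # later disks occlude earlier ones
        mask = (xx - cx[i])**2 + (yy - cy[i])**2 <= r[i]**2
        img[mask] = g[i]
    return img

target = dead_leaves()
```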
Development of instrumentation for amperometric and coulometric detection using ultramicroelectrodes
Abstract:
This work presents the development of simple, portable, and inexpensive instrumentation for amperometric and coulometric detection in different analytical instrumentation systems utilizing ultramicroelectrodes. The software, developed in LabVIEW 7.1™, is capable of carrying out three main detection techniques (amperometric, pulsed amperometric, and coulometric detection) and one voltammetric technique (cyclic voltammetry). The instrumentation was successfully evaluated using the following systems: cyclic voltammograms of metallic electrodes in alkaline solutions, flow electrochemical detection of glucose and glycine, and direct determination of the herbicide glyphosate (electrochemical detection coupled to HPLC).
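As a sketch of two of the techniques the instrument implements, the snippet below generates the triangular potential program of cyclic voltammetry and performs coulometric detection by integrating a current trace to obtain charge; scan limits, scan rate, and the current trace are illustrative, not taken from the LabVIEW implementation.

```python
# Sketch of two building blocks: the triangular potential sweep of cyclic
# voltammetry, and coulometric detection as charge Q = integral of i dt.
import numpy as np

def cv_waveform(e_start=-0.2, e_vertex=0.6, scan_rate=0.05, dt=0.001):
    """Potential (V) vs time for one forward/reverse sweep at scan_rate (V/s)."""
    t_half = abs(e_vertex - e_start) / scan_rate
    t = np.arange(0, 2 * t_half, dt)
    e = e_start + scan_rate * t                 # forward sweep
    back = t > t_half
    e[back] = e_vertex - scan_rate * (t[back] - t_half)  # reverse sweep
    return t, e

t, e = cv_waveform()
i = 1e-6 * np.exp(-t)                 # hypothetical decaying current trace (A)
q = np.trapz(i, t)                    # coulometric detection: charge in coulombs
print(f"charge passed: {q:.3e} C")
```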
Abstract:
The relationship between the level of cell confluence near the plateau phase of growth and blastocyst yield following somatic cell cloning is not well understood. We examined the effect of distinct cell culture confluence levels on the in vitro development of cloned bovine embryos. In vitro-matured bovine oocytes were manually bisected and selected by DNA staining. One or two enucleated hemi-cytoplasts were paired and fused with an adult skin somatic cell. Cultured skin cells from an adult Nellore cow, harvested at three distinct culture confluence levels (70-80, 80-90, and >95%), were used for the construction of embryos and hemi-embryos. After activation, structures were cultured in vitro as one embryo (1 x 100%) or as aggregates of two hemi-embryos (2 x 50%) per microwell. Fusion, cleavage, and blastocyst rates were compared using the chi-squared test. The fusion rate for hemi-embryos (51.4%) was lower than for embryos (67.6%), with no influence of the degree of cell confluence. However, blastocyst rates improved linearly (7.0, 17.5, and 29.4%) with increasing cell confluence. We conclude that the degree of cell culture confluence significantly influences subsequent embryo development; the use of a cell population at high confluence (>90%) for nuclear transfer significantly improved blastocyst yield after cloning.
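A minimal sketch of the chi-squared comparison named above, using SciPy; the contingency counts are hypothetical, since the abstract reports only percentages.

```python
# Sketch of a chi-squared test on a 2x2 contingency table, as used to compare
# fusion/blastocyst rates. Counts are hypothetical, for illustration only.
from scipy.stats import chi2_contingency

# rows: embryos vs hemi-embryos; columns: fused, not fused (hypothetical counts)
observed = [[68, 32],
            [51, 49]]
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")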
Abstract:
The objective of this work was to develop and validate a rapid reversed-phase high-performance liquid chromatography method for the quantification of 3,5,3′-triiodothyroacetic acid (TRIAC) in nanoparticle delivery systems prepared with different polymeric matrices. Special attention was given to developing a reliable, reproducible technique for the pretreatment of the samples. Chromatographic runs were performed on an Agilent 1200 Series HPLC with an RP Phenomenex® Gemini C18 column (150 × 4.6 mm i.d., 5 μm), using acetonitrile and 0.1% triethylamine (TEA) buffer (40:60 v/v) as the mobile phase in an isocratic elution at pH 5.6 and a flow rate of 1 mL min⁻¹. TRIAC was detected at a wavelength of 220 nm. The injection volume was 20 μL and the column temperature was maintained at 35 °C. The validation characteristics included accuracy, precision, specificity, linearity, recovery, and robustness. The standard curve was found to be linear (r² = 0.9996) over the analytical range of 5-100 μg mL⁻¹. The detection and quantitation limits were 1.3 and 3.8 μg mL⁻¹, respectively. The recovery and the TRIAC loading in the colloidal delivery system were nearly 100% and 98%, respectively. The method was successfully applied to polycaprolactone, polyhydroxybutyrate, and polymethylmethacrylate nanoparticles.
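A sketch of one common (ICH-style) way such figures of merit are obtained: fit the calibration line, then take DL = 3.3·σ/S and QL = 10·σ/S from the residual standard deviation σ and slope S. The calibration points below are hypothetical.

```python
# Sketch of calibration-based linearity and detection/quantitation limits
# (ICH-style estimate; not necessarily the procedure used in the paper).
import numpy as np

conc = np.array([5, 10, 25, 50, 75, 100], dtype=float)       # ug/mL, hypothetical
area = np.array([52, 101, 255, 508, 762, 1010], dtype=float)  # peak areas, hypothetical
slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)                                     # residual SD of the fit
r2 = np.corrcoef(conc, area)[0, 1] ** 2
print(f"r^2 = {r2:.4f}")
print(f"DL = {3.3 * sigma / slope:.2f} ug/mL, QL = {10 * sigma / slope:.2f} ug/mL")
```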
Abstract:
A green ceramic tape micro-heat exchanger was developed using Low Temperature Co-fired Ceramics (LTCC) technology. The device was designed using Computer-Aided Design software, and simulations were performed with a Computational Fluid Dynamics package (COMSOL Multiphysics) to evaluate the homogeneity of the fluid distribution in the microchannels. Four geometries were proposed and simulated in two and three dimensions, showing that geometric details directly affect the velocity distribution in the micro-heat exchanger channels. The simulation results proved quite useful for the design of the microfluidic device. The micro-heat exchanger was then constructed using LTCC technology; it is composed of five thermal exchange plates in a cross-flow arrangement and two connecting plates, all stacked to form a device with external dimensions of 26 × 26 × 6 mm³.
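A back-of-the-envelope counterpart to the CFD check described above: under laminar flow, each thin rectangular microchannel can be treated as a hydraulic resistance R ≈ 12μL/(wh³), so the flow split across parallel channels under a common pressure drop follows 1/R. Dimensions and fluid properties below are illustrative assumptions, not the device's.

```python
# Sketch of a lumped-resistance estimate of flow distribution across parallel
# microchannels (thin-channel approximation, w >> h); values are illustrative.
import numpy as np

mu = 1.0e-3                                   # water viscosity, Pa.s
L = np.array([10e-3, 10.5e-3, 11e-3])         # channel lengths, m (hypothetical)
w, h = 500e-6, 200e-6                         # channel width and height, m
R = 12 * mu * L / (w * h**3)                  # hydraulic resistance per channel
q = (1 / R) / np.sum(1 / R)                   # fractional flow per channel
print("flow fractions:", np.round(q, 3))      # deviations from uniform flag maldistribution
```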
Abstract:
The development and fabrication of a thermo-electro-optic sensor using a Mach-Zehnder interferometer and a resistive micro-heater placed on one of the device's arms is presented. The Mach-Zehnder structure was fabricated on a single-crystal silicon substrate using silicon oxynitride and amorphous hydrogenated silicon carbide films to form an anti-resonant reflective optical waveguide. The materials were deposited by the plasma-enhanced chemical vapor deposition technique at low temperatures (~320 °C). To optimize the heat transfer and increase the device's response to current variation, part of the Mach-Zehnder sensor arm was suspended through front-side bulk micromachining of the silicon substrate in a KOH solution. With the temperature variation caused by the micro-heater, the refractive index of the core layer of the optical waveguide changes due to the thermo-optic effect. Since this variation occurs in only one of the Mach-Zehnder's arms, a phase difference between the arms is produced, leading to optical interference at the output. In this way, the current applied to the micro-resistor can control the device's output optical power. In addition, the reactive ion etching technique was used in this work to define the device's geometry, and a study of SF6-based etching rates on silicon oxynitride films of different compositions is also presented.
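A minimal sketch of the thermo-optic transfer characteristic described above: a temperature rise ΔT in one arm shifts the phase by Δφ = (2π/λ)(dn/dT)·ΔT·L, and the interferometer output follows cos²(Δφ/2). The wavelength, thermo-optic coefficient, and heated length are illustrative values, not the fabricated device's.

```python
# Sketch of a Mach-Zehnder thermo-optic response: heater-induced index change
# in one arm -> phase difference -> modulated output power. Values hypothetical.
import numpy as np

wavelength = 1.55e-6        # m, assumed operating wavelength
dn_dT = 1.0e-4              # 1/K, hypothetical thermo-optic coefficient
L_heated = 2.0e-3           # m, hypothetical heated arm length

dT = np.linspace(0, 20, 200)                        # temperature rise, K
dphi = (2 * np.pi / wavelength) * dn_dT * dT * L_heated
p_out = np.cos(dphi / 2) ** 2                       # normalised output optical power
print("power at dT = 10 K:", p_out[np.searchsorted(dT, 10)])
```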
Abstract:
The present investigation is the first part of an initiative to prepare a regional map of the natural abundance of selenium in various areas of Brazil, based on the analysis of bean and soil samples. Continuous-flow hydride generation electrothermal atomic absorption spectrometry (HG-ET AAS) with in situ trapping on an iridium-coated graphite tube was chosen because of its high sensitivity and relative simplicity. The microwave-assisted acid digestion of bean and soil samples was tested for complete recovery of inorganic and organic selenium compounds (selenomethionine). The reduction of Se(VI) to Se(IV) was optimized in order to guarantee that there is no back-oxidation, which is important when digested samples are not analyzed immediately after the reduction step. The limits of detection and quantification of the method were 30 ng L⁻¹ Se and 101 ng L⁻¹ Se, respectively, corresponding to about 3 ng g⁻¹ and 10 ng g⁻¹ in the solid samples, considering a typical dilution factor of 100 for the digestion process. The results obtained for two certified food reference materials (CRM), soybean and rice, and for a soil and a sediment CRM confirmed the validity of the investigated method. The selenium content found in a number of selected bean samples varied between 5.5 ± 0.4 ng g⁻¹ and 1726 ± 55 ng g⁻¹, and that in soil samples varied between 113 ± 6.5 ng g⁻¹ and 1692 ± 21 ng g⁻¹.
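The dilution arithmetic linking the solution and solid-sample limits can be made explicit; the 0.5 g / 50 mL split below is one example consistent with the stated dilution factor of 100, not necessarily the procedure actually used.

```python
# Sketch of the dilution arithmetic: 30 ng/L in solution with a dilution
# factor of 100 mL of digest per gram of sample gives about 3 ng/g in the solid.
lod_solution_ng_per_L = 30.0
dilution_mL_per_g = 100.0                       # e.g. 0.5 g digested to 50 mL (assumed)
lod_solid = lod_solution_ng_per_L / 1000.0 * dilution_mL_per_g   # ng/mL * mL/g
print(f"{lod_solid:.1f} ng/g")                  # -> 3.0 ng/g
```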
Abstract:
The purpose of this work was to produce controlled-release matrices containing 120 mg of propranolol hydrochloride (PHCl), employing hydroxypropyl methylcellulose (HPMC, Methocel® K100) as the gel-forming barrier. Although this class of polymers has commonly been used for direct compression, in this study the tablets were produced by wet granulation, with the intent of using reduced polymer concentrations to achieve controlled drug release. HPMC percentages ranged from 15% to 34%, and both soluble and insoluble diluents were tested in the 10 proposed tablet compositions. Dissolution testing of the matrices was performed over a 12 h period, in pH 1.2 medium for the first 2 h and in pH 6.8 medium for the remaining 10 h. Dissolution kinetics were analyzed by applying zero-order, first-order, and Higuchi models with the aim of elucidating the drug release mechanism. All physicochemical characteristics, such as average weight, friability, hardness, diameter, height, and drug content, were in accordance with the pharmacopeial specifications. Considering that PHCl is a very soluble drug, low concentrations (15%) of HPMC were sufficient to reduce the drug release rate and to promote controlled release of PHCl, with good dissolution efficiencies of between 50% and 63%. The Higuchi model presented the best fit for the 15% HPMC formulations, indicating that the main release mechanism was diffusion. It can be concluded that the wet granulation method reduced matrix erosion and promoted controlled release of the drug at low HPMC percentages.
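A minimal sketch of the kinetic analysis described above: fit the zero-order (Q = k·t), first-order (ln(100 − Q) = ln(100) − k·t), and Higuchi (Q = k·√t) models by linear regression and compare r². The release data below are hypothetical, shaped to be diffusion-controlled as the study found.

```python
# Sketch of dissolution-kinetics model comparison by linearised fits.
import numpy as np

t = np.array([1, 2, 4, 6, 8, 10, 12], dtype=float)        # time, h
q = np.array([18, 26, 37, 45, 52, 58, 63], dtype=float)   # % released (hypothetical)

def r2(x, y):
    # Coefficient of determination for a straight-line fit of y on x.
    k, b = np.polyfit(x, y, 1)
    return 1 - np.sum((y - (k * x + b))**2) / np.sum((y - y.mean())**2)

print("zero-order  r^2:", round(r2(t, q), 4))              # Q vs t
print("first-order r^2:", round(r2(t, np.log(100 - q)), 4))  # ln(100-Q) vs t
print("Higuchi     r^2:", round(r2(np.sqrt(t), q), 4))     # Q vs sqrt(t)
```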
Abstract:
Vecuronium bromide is a neuromuscular blocking agent used in anesthesia to induce skeletal muscle relaxation. HPLC and CZE analytical methods were developed and validated for the quantitative determination of vecuronium bromide. The HPLC method used an amino column (Luna, 150 × 4.6 mm, 5 μm) with UV detection at 205 nm. The mobile phase was composed of acetonitrile:water containing 25.0 mmol L⁻¹ sodium phosphate monobasic (50:50 v/v), at pH 4.6 and a flow rate of 1.0 mL min⁻¹. The CZE method used an uncoated fused-silica capillary (40.0 cm total length, 31.5 cm effective length, 50 μm i.d.) with indirect UV detection at 230 nm. The electrolyte comprised 1.0 mmol L⁻¹ quinine sulfate dihydrate at pH 3.3 with 8.0% acetonitrile. The results were used to compare the two techniques, and no significant differences were observed (p > 0.05).
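A sketch of the kind of significance check behind "no significant differences (p > 0.05)": a two-sample t-test on replicate assay results from the two methods. The abstract does not state which test was used, and the replicate values below are hypothetical.

```python
# Sketch of a two-sample t-test comparing HPLC and CZE assay results.
from scipy.stats import ttest_ind

hplc = [99.2, 100.1, 99.8, 100.4, 99.5]    # % recovery, hypothetical replicates
cze = [99.6, 100.3, 99.1, 100.0, 99.9]
t_stat, p = ttest_ind(hplc, cze)
print(f"t = {t_stat:.3f}, p = {p:.3f}")    # p > 0.05 -> the methods agree
```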
Abstract:
High-performance liquid chromatography (HPLC) methods were validated for the determination of pravastatin sodium (PS), fluvastatin sodium (FVS), atorvastatin calcium (ATC), and rosuvastatin calcium (RC) in pharmaceuticals. Two stability-indicating HPLC methods were developed, differing only by a small change (10%) in the proportion of organic modifier in the mobile phase. The HPLC method for each statin was validated using isocratic elution. An RP-18 column was used with mobile phases consisting of methanol-water (60:40, v/v, for PS and RC; 70:30, v/v, for FVS and ATC). The pH of each mobile phase was adjusted to 3.0 with orthophosphoric acid, and the flow rate was 1.0 mL/min. Calibration plots showed correlation coefficients (r) greater than 0.999, calculated by the least-squares method. The detection limit (DL) and quantitation limit (QL) were 1.22 and 3.08 μg/mL for PS, 2.02 and 6.12 μg/mL for FVS, 0.44 and 1.34 μg/mL for ATC, and 1.55 and 4.70 μg/mL for RC. Intraday and interday relative standard deviations (RSDs) were below 2.0%. The methods were applied successfully to the quantitative determination of statins in pharmaceuticals.
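For completeness, the precision metric reported above is the relative standard deviation, RSD = 100·s/mean of replicate determinations; the replicate values below are hypothetical.

```python
# Sketch of the intraday/interday precision metric: relative standard deviation.
import numpy as np

replicates = np.array([50.2, 50.6, 49.9, 50.4, 50.1])   # ug/mL, hypothetical
rsd = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"RSD = {rsd:.2f} %")                              # acceptance here was < 2.0 %
```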