784 results for Wireless Sensor and Actuator Networks. Simulation. Reinforcement Learning. Routing Techniques
Abstract:
The rapid growth of big cities has been evident since the 1950s, when the majority of the world's population began living in urban areas rather than villages, seeking better job opportunities, higher-quality services and better lifestyle circumstances. This demographic transition from rural to urban areas is expected to continue. Governments, especially in less developed countries, will face growing challenges in different sectors, making it essential to understand the spatial pattern of this growth for effective urban planning. This study aimed to detect, analyse and model urban growth in the Greater Cairo Region (GCR), one of the fastest-growing megacities in the world, using remote sensing data. Knowing the current and projected urbanization situation in GCR will help decision makers in Egypt to adjust their plans and develop new ones. These plans should focus on resource reallocation to overcome future problems and to achieve sustainable development of urban areas, especially given the high proportion of illegal settlements established in recent decades. The study covered a period of 30 years, from 1984 to 2014, and the major transitions to urban land were modelled to predict future scenarios for 2025. Three satellite images from different dates (1984, 2003 and 2014) were classified using a Support Vector Machines (SVM) classifier, and land cover changes were then detected by applying a high-level mapping technique. The results were then analysed to estimate urban growth in 2025 more accurately using the Land Change Modeler (LCM) embedded in IDRISI software. Moreover, the spatial and temporal urban growth patterns were analysed using statistical metrics computed in FRAGSTATS software. The study achieved overall classification accuracies of 96%, 97.3% and 96.3% for the 1984, 2003 and 2014 maps, respectively.
Between 1984 and 2003, 19 179 hectares of vegetation and 21 417 hectares of desert changed to urban, while from 2003 to 2014, the transitions to urban from both land cover classes were found to be 16 486 and 31 045 hectares, respectively. The model results indicated that 14% of the vegetation and 4% of the desert in 2014 will turn into urban in 2025, representing 16 512 and 24 687 hectares, respectively.
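For orientation only, the projected figures above can be cross-checked with simple arithmetic. This is a sketch under our own assumption (not stated in the abstract) that each quoted percentage applies to that class's entire 2014 extent:

```python
# Illustrative arithmetic only, not the Land Change Modeler itself:
# back-computing the 2014 class areas implied by the projection above,
# under the assumption (ours, not the abstract's) that each stated
# percentage applies to that class's entire 2014 extent.
def implied_area_ha(projected_transition_ha, fraction_turning_urban):
    """2014 area of a land-cover class implied by a projected transition."""
    return projected_transition_ha / fraction_turning_urban

veg_2014 = implied_area_ha(16512, 0.14)     # 14% -> 16,512 ha by 2025
desert_2014 = implied_area_ha(24687, 0.04)  # 4%  -> 24,687 ha by 2025
print(round(veg_2014), round(desert_2014))  # ~117,943 ha and ~617,175 ha
```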
Abstract:
The main instrument used in psychological measurement is the self-report questionnaire. One of its major drawbacks, however, is its susceptibility to response biases. A known strategy to control these biases has been the use of so-called ipsative items, which require the respondent to make between-scale comparisons within each item. The selected option determines the scale to which the weight of the answer is attributed. Consequently, in questionnaires consisting only of ipsative items, every respondent is allotted an equal amount, i.e. the total score, which each can distribute differently over the scales. This type of response format therefore yields data that can be considered compositional from its inception. Methodologically oriented psychologists have heavily criticized this item format, since the resulting data are also marked by the associated unfavourable statistical properties. Nevertheless, clinicians have kept using these questionnaires to their satisfaction. This investigation therefore aims to evaluate both positions and addresses the similarities and differences between the two data collection methods. The ultimate objective is to formulate a guideline for when to use which type of item format. The comparison is based on data obtained with both an ipsative and a normative version of three psychological questionnaires, which were administered to 502 first-year psychology students according to a balanced within-subjects design. Previous research compared only the direct ipsative scale scores with the derived ipsative scale scores. The use of compositional data analysis techniques also enables one to compare derived normative score ratios with direct normative score ratios. The addition of this second comparison not only offers the advantage of a better-balanced research strategy; in principle, it also allows for parametric testing in the evaluation
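A minimal sketch of the compositional-data viewpoint mentioned above: fixed-total (ipsative) scores are typically analysed after a log-ratio transform, such as the centred log-ratio (clr). The scores below are illustrative, not the study's data:

```python
import numpy as np

# Centred log-ratio (clr) transform, a standard tool in compositional
# data analysis for scores that sum to a fixed total (as ipsative
# questionnaire scores do). Illustrative values, not the study's data.
def clr(x):
    """Centred log-ratio: log of each part over the geometric mean."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.log(x).mean())  # geometric mean of the parts
    return np.log(x / g)

scores = np.array([10.0, 20.0, 30.0])  # fixed-total scale scores
z = clr(scores)
print(np.isclose(z.sum(), 0.0))        # clr coordinates sum to zero
```

The clr coordinates live in ordinary Euclidean space, which is what makes conventional (parametric) statistics applicable to compositional data.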
Abstract:
Two spectrophotometric methods are described for the simultaneous determination of ezetimibe (EZE) and simvastatin (SIM) in pharmaceutical preparations. The obtained data were evaluated using two different chemometric techniques, Principal Component Regression (PCR) and Partial Least Squares (PLS-1). In these techniques, the concentration data matrix was prepared using mixtures containing these drugs in methanol. The corresponding absorbance data matrix was obtained by measuring absorbances in the range 240 - 300 nm at intervals of Δλ = 1 nm (61 wavelengths) in the zero-order spectra; calibration (regression) was then performed using the absorbance and concentration data matrices for the prediction of unknown concentrations of EZE and SIM in their mixture. The procedure did not require any separation step. The linear range was found to be 5 - 20 µg mL-1 for both EZE and SIM in both methods. The accuracy and precision of the methods were assessed. The methods were successfully applied to a pharmaceutical preparation (tablets), and the results were compared with each other.
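As an illustration of the PCR step described above, concentrations can be regressed on the leading principal components of a mean-centred absorbance matrix. This is a sketch on synthetic data, not the authors' calibration set:

```python
import numpy as np

# Minimal Principal Component Regression (PCR) sketch on synthetic data,
# mirroring the setup above: an absorbance matrix A (samples x 61
# wavelengths) calibrated against a concentration matrix C (samples x 2
# analytes). All numbers are simulated; this is not the authors' data.
rng = np.random.default_rng(0)
C = rng.uniform(5, 20, size=(15, 2))            # "EZE"/"SIM", 5-20 ug/mL
S = rng.random((2, 61))                         # pure-component spectra
A = C @ S + rng.normal(0, 0.01, (15, 61))       # linear mixing + noise

k = 2                                           # retained components
A_mean, C_mean = A.mean(axis=0), C.mean(axis=0)
U, s, Vt = np.linalg.svd(A - A_mean, full_matrices=False)
T = (A - A_mean) @ Vt[:k].T                     # scores on first k PCs
B = np.linalg.lstsq(T, C - C_mean, rcond=None)[0]

def predict(a_new):
    """Predict analyte concentrations from absorbance spectra."""
    return (np.asarray(a_new) - A_mean) @ Vt[:k].T @ B + C_mean

# Loose calibration-fit check on the training mixtures themselves:
print(np.abs(predict(A) - C).max() < 0.5)
```

PLS-1 differs in that it chooses latent components to maximize covariance with the response rather than variance of the spectra alone, but the calibrate-then-predict structure is the same.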
Abstract:
Resorcinol-formaldehyde (RF) organic gels have been extensively used to produce carbon aerogels. The organic gel synthesis parameters greatly affect the structure of the resulting aerogel. In this study, the influence of the catalyst quantity on the polymeric solution sol-gel process was investigated. Sodium carbonate was used as a basic catalyst. RF gels were synthesized with a resorcinol to formaldehyde molar ratio of 0.5, a resorcinol to catalyst (R/C) molar ratio equal to 50 or 300, and a resorcinol to solvent ratio of 0.1 g mL-1. The sol-gel process was evaluated in situ by Fourier transform infrared spectroscopy using a universal attenuated total reflectance sensor and measurements of the kinematic viscosity. The techniques showed the evolution of the sol-gel process, and the results showed that the lower catalyst quantity induced a higher gel point, with a lower viscosity at the gel point. Differential scanning calorimetry was used to investigate the thermal behavior of the RF dried gel, and results showed that the exothermic event related to the curing process was shifted to higher temperatures for solutions containing higher R/C ratios.
Abstract:
Selling is a much-maligned, often under-valued subject whose inadequate showing in business schools is in inverse proportion to the many job opportunities it offers and the importance of salespeople in bringing income to companies. The purpose of this research is to increase the understanding of customer-oriented selling and to examine the influence of a customer-oriented philosophy on the selling process, the applicability of selling techniques to this philosophy, and their importance to salespeople. The empirical section of the study is two-fold. First, the data for the qualitative part were collected through five thematic interviews with sales consultants and case company representatives. The findings indicate that customer-oriented selling requires activity from salespeople. In the customer-oriented personal selling process, salespeople invest time in the preplanning, need analysis and benefit demonstration stages. However, the findings suggest that salespeople today must also have the basic capabilities for executing the traditional sales process, and that the balance between the traditional and consultative selling processes changes as the relationship between salesperson and customer lengthens. The study also proposes that selling techniques still belong in the customer-oriented selling process, although their role may be modest. The thesis mapped 75 selling techniques, and the quantitative part of the study explored which of them salespeople in the direct selling industry consider important when selling to new and existing customers. The response rate of the survey was 69.5%.
Abstract:
Samples of whole crop wheat (WCW, n = 134) and whole crop barley (WCB, n = 16) were collected from commercial farms in the UK over a 2-year period (2003/2004 and 2004/2005). Near infrared reflectance spectroscopy (NIRS) was compared with laboratory and in vitro digestibility measures to predict digestible organic matter in the dry matter (DOMD) and metabolisable energy (ME) contents measured in vivo using sheep. Spectral models using the mean spectra of two scans were compared with those using individual spectra (duplicate spectra). Overall, NIRS accurately predicted the concentration of chemical components in whole crop cereals apart from crude protein, ammonia-nitrogen, water-soluble carbohydrates, fermentation acids and solubility values. In addition, the spectral models had higher prediction power for in vivo DOMD and ME than chemical components or in vitro digestion methods. Overall there was a benefit from the use of duplicate spectra rather than mean spectra, especially for predicting in vivo DOMD and ME, where the sample population size was smaller. The spectral models derived dealt equally well with WCW and WCB and would be of considerable practical value, allowing rapid determination of the nutritive value of these forages before their use in diets of productive animals. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Foods that provide medical and health benefits or play a role in disease risk prevention are termed functional foods. The functionality of functional foods derives from bioactive compounds: extranutritional constituents present in small quantities in food. Bioactive components include a range of chemical compounds with varying structures, such as carotenoids, flavonoids, plant sterols, omega-3 fatty acids (n-3), allyl and diallyl sulfides, indoles (benzopyrroles), and phenolic acids. Increasing consumer interest in natural bioactive compounds has brought about a rise in demand for these compounds and, in parallel, a growing number of scientific studies with this type of substance as their main topic. The principal aim of this PhD research project was the study of different bioactive and toxic compounds in several natural matrices. To achieve this goal, chromatographic, spectroscopic and sensory analyses were performed. This manuscript reports the main results obtained in six activities, briefly summarized as follows:
• SECTION I: the influence of conventional packaging on lipid oxidation of pasta was evaluated in egg spaghetti.
• SECTION II: the effect of storage at different temperatures on virgin olive oil was monitored by peroxide value, fatty acid activity, OSI test and sensory analysis.
• SECTION III: the glucosinolate and phenolic content of 37 rocket salad accessions were evaluated, comparing Eruca sativa and Diplotaxis tenuifolia species. Sensory analysis and the influence of the phenolic and glucosinolate composition on the sensory attributes of rocket salads were also studied.
• SECTION IV: ten buckwheat honeys were characterised on the basis of their pollen, physicochemical, phenolic and volatile composition.
• SECTION V: the polyphenolic fraction (anthocyanins and other polar compounds), the antioxidant capacity and the anti-hyperlipemic action of the aqueous extract of Hibiscus sabdariffa were investigated.
• SECTION VI: a normal-phase high-pressure liquid chromatography-fluorescence detection method for the quantitation of flavanols and procyanidins in cocoa powder and chocolate samples was optimized.
Abstract:
Nowadays microfluidics is becoming an important technology in many chemical and biological processes and analysis applications. The potential to replace large-scale conventional laboratory instrumentation with miniaturized and self-contained systems, called lab-on-a-chip (LOC) or point-of-care-testing (POCT) systems, offers a variety of advantages, such as low reagent consumption, faster analysis and the capability of operating on a massively parallel scale to achieve high throughput. Micro-electro-mechanical-systems (MEMS) technologies enable both the fabrication of miniaturized systems and the development of compact and portable instruments. The work described in this dissertation is directed towards the development of micromachined separation devices for both high-speed gas chromatography (HSGC) and gravitational field-flow fractionation (GrFFF) using MEMS technologies. Concerning HSGC, a complete platform of three MEMS-based GC core components (injector, separation column and detector) was designed, fabricated and characterized. The microinjector consists of a set of pneumatically driven microvalves based on a polymeric actuating membrane. Experimental results demonstrate that the microinjector guarantees low dead volumes, fast actuation times, a wide operating temperature range and high chemical inertness. The microcolumn is an all-silicon column with a nearly circular channel cross-section; extensive characterization has produced separation performances very close to the theoretical expectations. A thermal conductivity detector (TCD) was chosen as the most suitable detector to miniaturize, since reducing the volume of the detector chamber increases mass sensitivity and reduces dead volumes. The microTCD shows good sensitivity and a very wide dynamic range. Finally, a feasibility study for miniaturizing a channel suited to GrFFF was performed. The proposed GrFFF microchannel is at an early stage of development, but represents a first step towards the realization of a highly portable and potentially low-cost POCT device for biomedical applications.
Abstract:
In the present thesis, a new diagnosis methodology based on advanced time-frequency analysis is presented. More precisely, a new fault index is defined that allows individual fault components to be tracked in a single frequency band. In detail, a frequency sliding is applied to the signals being analyzed (currents, voltages, vibration signals), so that each fault frequency component is shifted into a prefixed single frequency band. The discrete wavelet transform is then applied to the resulting signal to extract the fault signature in the chosen frequency band. Once the state of the machine has been qualitatively diagnosed, a quantitative evaluation of the fault degree is necessary. For this purpose, a fault index based on the energy of the approximation and/or detail signals resulting from the wavelet decomposition is introduced to quantify the fault extent. The main advantages of the new method over existing diagnosis techniques are the following: capability of monitoring the fault evolution continuously over time under any transient operating condition; no requirement for speed/slip measurement or estimation; higher accuracy in filtering frequency components around the fundamental in the case of rotor faults; reduced likelihood of false indications by avoiding confusion with other fault harmonics (the contributions of the most relevant fault frequency components under speed-varying conditions are confined to a single frequency band); low memory requirements due to the low sampling frequency; and reduced processing latency (no repeated sampling operations are required).
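The energy-based index described above can be sketched in a few lines. A single-level Haar DWT stands in here for the thesis's wavelet decomposition (whose family and depth the abstract does not specify), and the signals are synthetic:

```python
import numpy as np

# Sketch of an energy fault index: decompose the signal with a discrete
# wavelet transform and sum the squared coefficients of the band of
# interest. A single-level Haar DWT is used for brevity; the thesis's
# wavelet family and decomposition depth are not given in the abstract.
def haar_dwt(x):
    """One-level Haar DWT: (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass half-band
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass half-band
    return a, d

def energy(coeffs):
    """Energy fault index: sum of squared wavelet coefficients."""
    return float(np.sum(np.asarray(coeffs) ** 2))

t = np.arange(1024) / 1024.0
healthy = np.sin(2 * np.pi * 50 * t)                  # fundamental only
faulty = healthy + 0.3 * np.sin(2 * np.pi * 200 * t)  # added "fault" tone

_, d_h = haar_dwt(healthy)
_, d_f = haar_dwt(faulty)
print(energy(d_f) > energy(d_h))  # the fault raises the band energy
```

In the thesis's scheme, the frequency-sliding step would first shift the fault component of interest into the monitored band before this decomposition is applied.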
Abstract:
Since its approval by the FDA in 2001, capsule endoscopy has revolutionized the study of the small bowel. One of the main limitations to its diffusion has been its high cost. More recently, a new videocapsule system (OMOM CE) was developed in China and obtained the CE mark. Its cost is approximately half that of other capsule systems. However, there are few studies on the clinical experience with this new videocapsule system, and none of them has been performed in the western world. Among the limitations of capsule endoscopy is also one linked to diagnostic yield: the rapid transit of the device through the proximal segments implies a high risk of false negatives; an indirect confirmation of this limit is offered by the poor ability to identify the papilla of Vater. In addition, recent studies show that in patients with obscure gastrointestinal bleeding, a negative capsule endoscopy is correlated with a significant risk of recurrence of anemia in the short term, as well as with the presence of small bowel lesions documented by a second capsule endoscopy. The use of a new device called "CapsoCam" (CapsoVision, Inc., Saratoga) was recently approved; it is characterized by four side cameras that offer a panoramic 360° view instead of the 160° frontal one. Two recent pilot studies showed safety profiles and diagnostic yields comparable with those of the more standardized capsules. Notably, side vision has made possible a clear visualization of the papilla in 70% of cases. The aim of our study is to evaluate the feasibility and diagnostic yield of these two new devices, which may first of all allow a reduction in costs. Moreover, their complementary use could allow diagnoses to be recovered in patients with false-negative results on an initial investigation.
Abstract:
High-resolution microscopy techniques provide a plethora of information on biological structures from the cellular level down to the molecular level. In this review, we present the unique capabilities of transmission electron and atomic force microscopy to assess the structure, oligomeric state, function and dynamics of channel and transport proteins in their native environment, the lipid bilayer. Most importantly, membrane proteins can be visualized in the frozen-hydrated state and in buffer solution by cryo-transmission electron and atomic force microscopy, respectively. We also illustrate the potential of the scintillation proximity assay to study substrate binding of detergent-solubilized transporters prior to crystallization and structural characterization.
Abstract:
PURPOSE: Two noninvasive methods to measure dental implant stability are damping capacity assessment (Periotest) and resonance frequency analysis (Osstell). The objective of the present study was to assess the correlation of these 2 techniques in clinical use. MATERIALS AND METHODS: Implant stability of 213 clinically stable loaded and unloaded 1-stage implants in 65 patients was measured in triplicate by means of resonance frequency analysis and Periotest. Descriptive statistics as well as Pearson's, Spearman's, and intraclass correlation coefficients were calculated with SPSS 11.0.2. RESULTS: The mean values were 57.66 +/- 8.19 implant stability quotient for the resonance frequency analysis and -5.08 +/- 2.02 for the Periotest. The correlation of both measuring techniques was -0.64 (Pearson) and -0.65 (Spearman). The single-measure intraclass correlation coefficients for the ISQ and Periotest values were 0.99 and 0.88, respectively (95% CI). No significant correlation of implant length with either resonance frequency analysis or Periotest could be found. However, a significant correlation of implant diameter with both techniques was found (P < .005). The correlation of both measuring systems is moderate to good. It seems that the Periotest is more susceptible to clinical measurement variables than the Osstell device. The intraclass correlation indicated lower measurement precision for the Periotest technique. Additionally, the Periotest values differed more from the normal (Gaussian) curve of distribution than the ISQs. Both measurement techniques show a significant correlation to the implant diameter. CONCLUSION: Resonance frequency analysis appeared to be the more precise technique.
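The headline statistic above is an ordinary Pearson correlation between paired readings from the two instruments. A minimal sketch with synthetic value pairs (not the study's data; note that the expected correlation is negative, since stable implants score high on ISQ but low on the Periotest scale):

```python
import numpy as np

# Pearson's r between paired ISQ (Osstell) and Periotest readings.
# The eight value pairs below are synthetic illustrations only.
def pearson_r(x, y):
    """Pearson product-moment correlation of two paired samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

isq = [48, 52, 55, 58, 60, 63, 66, 70]        # implant stability quotient
periotest = [-2, -3, -4, -4, -5, -6, -7, -8]  # Periotest values
print(round(pearson_r(isq, periotest), 2))    # strong negative correlation
```

Spearman's coefficient, also reported above, is the same computation applied to the ranks of the readings rather than the raw values.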