1000 results for Globalizing methods
Abstract:
Mixed methods research is becoming increasingly important in several scientific areas. The analysis of prevalence rates is a new line of research that has emerged in mixed methods research, and this methodological approach has so far been applied carefully in only a handful of journals. The purpose of this article was to analyse the prevalence of mixed methods research in interdisciplinary educational journals and to examine the main characteristics of the mixed methods articles identified. The study itself used a mixed methods approach: specifically, a partially mixed, sequential, equal-status multiple-case study design was applied with a development mixed methods purpose. Three educational journals in different disciplines (Academy of Management Learning and Education, Educational Psychology Review, Journal of the Learning Sciences) were reviewed from 2005 to 2010. The findings show differences among the journals in both the prevalence rates and the characteristics of the mixed methods studies.
Abstract:
Technology scaling has proceeded into dimensions in which the reliability of manufactured devices is becoming endangered. The decrease in reliability is a consequence of physical limitations, the relative increase of variations, and decreasing noise margins, among other factors. A promising solution for bringing the reliability of circuits back to a desired level is the use of design methods that introduce tolerance against possible faults in an integrated circuit. This thesis studies and presents fault tolerance methods for the network-on-chip (NoC), a design paradigm targeted at very large systems-on-chip. In a NoC, resources such as processors and memories are connected to a communication network, comparable to the Internet. Fault tolerance in such a system can be achieved at many abstraction levels. The thesis studies the origin of faults in modern technologies and explains their classification into transient, intermittent, and permanent faults. A survey of fault tolerance methods is presented to demonstrate the diversity of available methods. Networks-on-chip are approached by exploring their main design choices: the selection of a topology, routing protocol, and flow control method. Fault tolerance methods for NoCs are studied at different layers of the OSI reference model. The data link layer provides a reliable communication link over a physical channel. At this abstraction level, error control coding is an efficient fault tolerance method, especially against transient faults. Error control coding methods suitable for on-chip communication are studied and their implementations presented. Error control coding loses its effectiveness in the presence of intermittent and permanent faults, so other solutions against them are presented: the introduction of spare wires and split transmissions are shown to provide good tolerance against intermittent and permanent errors, and their combination with error control coding is illustrated. At the network layer, positioned above the data link layer, fault tolerance can be achieved through the design of fault tolerant network topologies and routing algorithms. Both of these approaches are presented in the thesis, together with realizations in both categories. The thesis concludes that an optimal fault tolerance solution contains carefully co-designed elements from different abstraction levels.
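As an illustration of error control coding at the data link layer, the sketch below implements a single-error-correcting Hamming(7,4) code in Python. This is a generic textbook code chosen for illustration; the thesis surveys several coding schemes and does not prescribe this particular one.

```python
# Minimal Hamming(7,4) single-error-correcting code: a classic example of
# error control coding against transient faults on a link. Illustrative only.

def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity check over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the erroneous bit
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the faulty bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[5] ^= 1                         # inject a single transient fault
assert hamming74_decode(word) == data
```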
Abstract:
Recent years have produced great advances in instrumentation technology. The amount of available data has been increasing due to the simplicity, speed, and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without proper analysis, which has been one of the reasons for the growing success of multivariate handling of such data. Industrial data are commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This places certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches in the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures, or partial least squares), but there are also other methods that should be considered. The more advanced methods include multi-block modeling and nonlinear modeling. This thesis shows that the results of data analysis vary according to the modeling approach used, making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should differ from the case where the purpose of modeling is mainly to obtain information about the variables and the process. For industrial applicability it is essential that the methods be robust and sufficiently simple to apply; in this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. In this thesis, the differences between data analysis methods are compared using data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries, and the results are compared to those from PLS and priority PLS. The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry: the response has a nonlinear relation to the descriptor matrix, and the results are compared between linear modeling, polynomial PLS, and nonlinear modeling using nonlinear score vectors.
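To make the two baseline approaches concrete, the following sketch fits PCA and PLS with scikit-learn on synthetic data. The data set, variable counts, and component numbers are assumptions for illustration only, not taken from the thesis.

```python
# Sketch comparing PCA and PLS on an industrial-style data set;
# X and y here are synthetic stand-ins, not data from the thesis.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))      # 200 samples, 10 process variables
y = X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.1, size=200)  # known response

# PCA summarizes the variance structure of X alone (exploration).
pca = PCA(n_components=3).fit(X)
print("PCA explained variance ratios:", pca.explained_variance_ratio_)

# PLS finds latent variables in X that covary with y (prediction).
pls = PLSRegression(n_components=3).fit(X, y)
print("PLS R^2 on training data:", pls.score(X, y))
```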
Abstract:
The thermal stability of vegetable oils is an important factor that affects their quality. In this study, we investigated the thermal stability of oil and lecithin extracted from soybeans by two distinct processes: mechanical extraction (pressing) and physical extraction (solvent). Thermal analysis was used to obtain information about the different extraction methodologies. The physically extracted products proved more stable than those extracted mechanically. Raman and UV-Vis techniques were applied to underpin the discussion of the differences between the processes.
Abstract:
Metaheuristic methods have become increasingly popular approaches for solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization, and the topic has received increasing interest, especially in the evolutionary computation community; several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima places additional demands on the design of algorithms, which must be effective in both respects in the context of multimodal optimization. In this thesis, several different multimodal optimization algorithms are studied with regard to how their implementations of the global and local search phases affect their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is generally not thoroughly understood, the research relies heavily on experimental studies to establish the properties of the different approaches. To obtain reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varied problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists. As a part of this thesis, such a framework for generating tunable test functions for experimentally evaluating different multimodal optimization methods is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may negate the advantages gained from the global phase.
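The division of labour between the global and local phases can be sketched with off-the-shelf tools: SciPy's differential_evolution performs the global DE phase, and its polish option refines the best candidate with a local L-BFGS-B search. Restarting from different seeds and keeping distinct solutions is a crude stand-in for niching; the algorithms proposed in the thesis are not reproduced here.

```python
# Global DE phase plus local polishing, restarted to collect several optima
# of a standard multimodal test function. Illustrative sketch only.
import numpy as np
from scipy.optimize import differential_evolution

def himmelblau(x):
    # A standard multimodal test function with four global minima.
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

found = []
for seed in range(20):                       # restarts to hit several optima
    res = differential_evolution(himmelblau, bounds=[(-5, 5), (-5, 5)],
                                 seed=seed, polish=True)
    # Keep the optimum only if it is not (numerically) already known.
    if all(np.linalg.norm(res.x - f) > 1e-2 for f in found):
        found.append(res.x)

print(f"{len(found)} distinct optima located:")
for x in found:
    print(np.round(x, 4))
```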
Abstract:
Strategic development of distribution networks plays a key role in the asset management of electricity distribution companies. Owing to the capital-intensive nature of the field and the long time span of company operations, the significance of a strategy is emphasised. A well-devised strategy combines awareness of the challenges posed by the operating environment with the future targets of the distribution company. Economic regulation, ageing infrastructure, scarcity of resources, and tightening supply requirements, together with the challenges created by climate change, put pressure on the strategy work. On the other hand, technology development related to network automation and underground cabling helps answer these challenges. This dissertation aims at developing process knowledge and establishing a methodological framework by which key issues related to network development can be addressed. Moreover, the work develops tools by which the effects of changes in the operating environment on the distribution business can be analysed in the strategy work. To this end, the work discusses certain characteristics of the distribution business and describes the strategy process at a general level. Further, the work defines the subtasks in the strategy process and presents the key elements in the strategy work and long-term network planning. The work delineates the factors having either a direct or indirect effect on strategic planning and development needs in the networks; in particular, outage costs constitute an important part of the economic regulation of the distribution business, reliability thus being a key driver in network planning. The dissertation describes the methodology and tools applied to cost and reliability analyses in the strategy work. The work focuses on determining the techno-economic feasibility of different network development technologies; these feasibility surveys are linked to the economic regulation model of the distribution business, in particular from the viewpoint of reliability of electricity supply and allowed return. The work introduces the asset management system developed for research purposes and to support the strategy work, the calculation elements of the system, and the initial data used in the network analysis. The key elements of this asset management system are utilised in the dissertation. Finally, the study addresses the stages of strategic decision-making and the compilation of investment strategies, and illustrates the implementation of strategic planning in an actual distribution company environment.
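A minimal sketch of an outage cost calculation of the kind used in such regulation models is given below; the unit cost parameters and network figures are hypothetical, not values from the dissertation or from any regulator.

```python
# Hedged sketch of an annual outage cost calculation: each interruption is
# charged a power-based cost plus an energy-not-supplied cost. All parameter
# values below are hypothetical placeholders.

def annual_outage_cost(p_avg_kw, interruptions_per_year, avg_duration_h,
                       cost_per_kw=1.1, cost_per_kwh=11.0):
    """Outage cost = faults * (power-based cost + energy-not-supplied cost)."""
    per_event = cost_per_kw * p_avg_kw + cost_per_kwh * p_avg_kw * avg_duration_h
    return interruptions_per_year * per_event

# Feeder with 500 kW average load, 3 faults/year, 1.5 h average duration.
print(f"Annual outage cost: {annual_outage_cost(500, 3, 1.5):.0f} EUR")
```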
Abstract:
Stratospheric ozone can be measured accurately using a limb scatter remote sensing technique in the UV-visible spectral region of solar light. The advantages of this technique include good vertical resolution and good daytime coverage of the measurements. In addition to ozone, UV-visible limb scatter measurements contain information about NO2, NO3, OClO, BrO, and aerosols. Several satellite instruments currently scan the atmosphere continuously and measure the UV-visible region of the spectrum, e.g., the Optical Spectrograph and Infrared Imager System (OSIRIS) launched on the Odin satellite in February 2001, and the Scanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY (SCIAMACHY) launched on Envisat in March 2002. Envisat also carries the Global Ozone Monitoring by Occultation of Stars (GOMOS) instrument, which also measures limb-scattered sunlight under bright limb occultation conditions; these conditions occur during daytime occultation measurements. The global coverage of the satellite measurements is far better than that of any other ozone measurement technique, but the measurements are still sparse in the spatial domain. Measurements over a given area are also repeated relatively rarely, while the composition of the Earth's atmosphere changes dynamically. Assimilation methods are therefore needed to combine the information from the measurements with an atmospheric model. In recent years, the focus of assimilation algorithm research has turned towards filtering methods. The traditional Extended Kalman filter (EKF) method takes into account not only the uncertainty of the measurements but also the uncertainty of the evolution model of the system. However, the computational cost of a full-blown EKF increases rapidly as the number of model parameters increases, so the EKF method cannot be applied directly to the stratospheric ozone assimilation problem. The work in this thesis is devoted to the development of inversion methods for satellite instruments and of assimilation methods used with atmospheric models.
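The scaling problem can be seen from a minimal Kalman filter step: the n x n state covariance must be stored and propagated, costing O(n^2) memory and roughly O(n^3) time per step. The toy sketch below uses a linear model; in the EKF, F and H would be Jacobians of the nonlinear evolution and observation models. All matrices here are illustrative assumptions, not the ozone assimilation model.

```python
# One Kalman filter predict/update cycle for a linear-Gaussian toy system.
import numpy as np

def kf_step(x, P, F, Q, H, R, y):
    """x is the state estimate, P its covariance; y is the new measurement."""
    # Predict: propagate state and covariance through the evolution model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q          # the n x n update is the bottleneck
    # Update: blend the forecast with the measurement via the Kalman gain.
    S = H @ P_pred @ H.T + R          # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

n, m = 4, 2                           # toy state and measurement dimensions
x, P = np.zeros(n), np.eye(n)
F, Q = np.eye(n), 0.01 * np.eye(n)
H, R = np.eye(m, n), 0.1 * np.eye(m)
x, P = kf_step(x, P, F, Q, H, R, y=np.array([1.0, 0.5]))
```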
Abstract:
BACKGROUND: The aim of the Cancer Fast-track Programme was to reduce the time that elapsed between well-founded suspicion of breast, colorectal, and lung cancer and the start of initial treatment in Catalonia (Spain). We sought to analyse its implementation and overall effectiveness. METHODS: A quantitative analysis of the programme was performed using data generated by the hospitals on the basis of seven fast-track monitoring indicators for the period 2006-2009. In addition, we conducted a qualitative study, based on 83 semistructured interviews with primary and specialised health professionals and health administrators, to obtain their perception of the programme's implementation. RESULTS: About half of all new patients with breast, lung, or colorectal cancer were diagnosed via the fast track, though the cancer detection rate declined across the period. Mean time from detection of suspected cancer in primary care to start of initial treatment was 32 days for breast, 30 for colorectal, and 37 for lung cancer (2009). Professionals involved in the programme's implementation reported that general practitioners faced with a suspicion of cancer had changed their conduct with the aim of preventing delays. Furthermore, hospitals were found to have pursued three distinct implementation strategies (top-down, consensus-based, and participatory), which made for the cohesion and sustainability of the circuits. CONCLUSION: The programme has contributed to speeding up the diagnostic assessment and treatment of patients with a suspicion of cancer, and to clarifying the patient pathway between primary and specialised care.
Abstract:
This article explores the possibilities offered by visual methods in the move towards inclusive research, reviewing some methodological implications of such research and reflecting on the potential of visual methods to meet these methodological requirements. A study into the impact of work on the social inclusion and social relationships of people suffering from severe mental illness (SMI) serves to illustrate the use of visual methods such as photo elicitation and graphic elicitation in the context of in-depth interviews, with the aim of improving the target group's participation in research, participation being understood as one of the basic elements of inclusive approaches. On the basis of this study, we reflect on the potential of visual methods to improve the inclusive approach to research and conclude that these methods are open and flexible in awarding participants a voice, allowing people with SMI to express their needs, and therefore add value to said approach.
Abstract:
Different methods to determine total fat (TF) and fatty acids (FA), including trans fatty acids (TFA), in diverse foodstuffs were evaluated, incorporating gravimetric methods and gas chromatography with a flame ionization detector (GC/FID), in accordance with a modified AOAC 996.06 method. Concentrations of TF and FA obtained through these different procedures diverged (p < 0.05), and TFA concentrations varied by more than 20% from the reference values. The modified AOAC 996.06 method met both accuracy and precision requirements, was fast, and employed small amounts of low-toxicity solvents. The results therefore show that this methodology is suitable for adoption in Brazil for nutritional labelling purposes.
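The kind of method comparison implied by the p < 0.05 result can be sketched with a paired t-test on measurements of the same samples by two procedures; the numbers below are invented for illustration, not data from the study.

```python
# Hedged sketch of a paired method comparison; all values are invented.
import numpy as np
from scipy import stats

# Total fat (g/100 g) for six foodstuffs, measured by two procedures.
gravimetric = np.array([12.1, 8.4, 30.2, 5.5, 17.8, 22.0])
gc_fid      = np.array([11.4, 7.9, 28.5, 5.1, 16.9, 20.8])

t, p = stats.ttest_rel(gravimetric, gc_fid)
print(f"paired t = {t:.2f}, p = {p:.4f}")   # p < 0.05 -> methods diverge

# Percent deviation from reference values, as applied to TFA in the abstract.
reference = np.array([11.8, 8.0, 29.5, 5.3, 17.2, 21.5])
dev = 100 * np.abs(gc_fid - reference) / reference
print(f"max deviation from reference: {dev.max():.1f} %")
```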
Abstract:
Reversed-phase liquid chromatographic (LC) and ultraviolet (UV) spectrophotometric methods were developed and validated for the assay of bromopride in oral and injectable solutions. The methods were validated according to ICH guidelines. Both methods were linear over the range of 5-25 μg mL-1 (y = 41837x - 5103.4, r = 0.9996 and y = 0.0284x - 0.0351, r = 1, respectively). The statistical analysis showed no significant difference between the results obtained by the two methods. The proposed methods were found to be simple, rapid, precise, accurate, and sensitive, and can be used in the routine quantitative analysis of bromopride in oral and injectable solutions.
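The linearity figures quoted above come from least-squares calibration lines; the sketch below reproduces the procedure on synthetic standards spanning the 5-25 μg mL-1 range and back-calculates an unknown from its response. The data points are stand-ins, not the validation data.

```python
# Hedged sketch of a linearity check and back-calculation for a calibration
# curve like the LC one quoted above; the responses are synthetic.
import numpy as np

conc = np.array([5.0, 10.0, 15.0, 20.0, 25.0])          # ug/mL standards
response = 41837 * conc - 5103.4 + np.random.default_rng(1).normal(0, 2000, 5)

slope, intercept = np.polyfit(conc, response, 1)         # least-squares line
r = np.corrcoef(conc, response)[0, 1]                    # correlation coefficient
print(f"y = {slope:.0f}x + {intercept:.1f}, r = {r:.4f}")

# Back-calculate an unknown sample's concentration from its peak area.
area = 500000.0
print(f"Estimated concentration: {(area - intercept) / slope:.2f} ug/mL")
```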
Abstract:
Monitoring of sewage sludge has demonstrated the presence of many polar anthropogenic pollutants since LC/MS techniques came into routine use. While advanced techniques may improve characterizations, flawed sample processing procedures may disturb or disguise the presence and fate of many target compounds in this type of complex matrix before the analytical process starts. Freeze-drying and oven-drying, in combination with centrifugation or filtration, were performed as sample processing techniques, followed by visual pattern recognition of target compounds to assess the pretreatment processes. The results showed that oven-drying affected the sludge characterization, while freeze-drying led to fewer analytical misinterpretations.
Abstract:
Two food products (powders) were obtained by applying hot-air drying or lyophilisation methods to whole guava fruits. The powders were characterised by sensory and thermal analyses (TGA-DSC), infrared spectroscopy (IR), X-ray diffraction (XRD), and scanning electron microscopy (SEM). Thermal, morphological, and structural characterisations showed similar behaviour for the two solids. TGA-DSC and IR showed the presence of pectin as the main constituent of the solids. A semi-crystalline profile was evidenced by XRD, and lamellar/spherical morphologies were observed by SEM. Sensory analyses revealed an aroma closely related to guava. These value-added food products offer an alternative way to process guava and to avoid losses during postharvest handling.
Abstract:
In this work we report the synthesis of sulfonamide derivatives using a conventional procedure and with solid supports such as silica gel, florisil, alumina, 4Å molecular sieves, montmorillonite KSF, and montmorillonite K10, using solvent-free and microwave-assisted methods. Our results show that solid supports have catalytic activity in the formation of sulfonamide derivatives. We found that florisil, montmorillonite KSF, and montmorillonite K10 could be used as inexpensive alternative catalysts that are easily separated from the reaction media. Additionally, the solvent-free and microwave-assisted methods were more efficient in reducing reaction time and increasing yield.
Abstract:
Three simple, sensitive, economical, and reproducible spectrophotometric methods (A, B, and C) are described for the determination of mesalamine in the pure drug as well as in tablet dosage forms. Method A is based on the reduction of tungstate and/or molybdate in Folin-Ciocalteu's reagent; method B is based on the reaction between the diazotized drug and α-naphthol; and method C is based on the reaction of the drug with vanillin in acidic medium. Under optimum conditions, mesalamine could be quantified in the concentration ranges of 1-30, 1-15, and 2-30 µg mL-1 by methods A, B, and C, respectively. All the methods have been applied to the determination of mesalamine in tablet dosage forms, and the results of the analysis were validated statistically.
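Quantitation in such assays inverts a linear calibration and checks that the result falls within the method's validated range; the sketch below does this using the ranges quoted above and a hypothetical calibration slope.

```python
# Hedged sketch of the quantitation step behind spectrophotometric assays
# such as methods A-C. The ranges are those quoted in the abstract; the
# calibration slope and absorbance reading are hypothetical.
RANGES_UG_PER_ML = {"A": (1, 30), "B": (1, 15), "C": (2, 30)}

def quantify(absorbance, slope, intercept=0.0):
    """Invert a linear calibration y = slope*c + intercept to get concentration."""
    return (absorbance - intercept) / slope

c = quantify(absorbance=0.42, slope=0.028)   # hypothetical calibration slope
for method, (lo, hi) in RANGES_UG_PER_ML.items():
    status = "within" if lo <= c <= hi else "outside"
    print(f"Method {method}: {c:.1f} ug/mL is {status} the {lo}-{hi} range")
```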