Resumo:
The purpose of this research is to develop design considerations for environmental monitoring platforms for the detection of hazardous materials using System-on-a-Chip (SoC) design. Design considerations focus on improving three key areas: (1) sampling methodology; (2) context awareness; and (3) sensor placement. These design considerations for environmental monitoring platforms using wireless sensor networks (WSN) are applied to the detection of methylmercury (MeHg) and the environmental parameters affecting its formation (methylation) and degradation (demethylation).
The sampling methodology investigates a proof of concept for the monitoring of MeHg using three primary components: (1) chemical derivatization; (2) preconcentration using the purge-and-trap (P&T) method; and (3) sensing using Quartz Crystal Microbalance (QCM) sensors. This study focuses on the measurement of inorganic mercury (Hg) (e.g., Hg2+) and applies lessons learned to organic Hg (e.g., MeHg) detection.
Context awareness of a WSN and its sampling strategies is enhanced by using spatial analysis techniques, namely geostatistical analysis (i.e., classical variography and ordinary point kriging), to help predict the phenomenon of interest at unmonitored locations (i.e., locations without sensors). This aids in making more informed decisions on control of the WSN (e.g., communications strategy, power management, resource allocation, sampling rate and strategy). This methodology improves the precision of control by adding potentially significant information about unmonitored locations.
Two types of sensors are investigated in this study for near-optimal placement in a WSN: (1) environmental sensors (e.g., humidity, moisture, temperature) and (2) visual sensors (e.g., cameras). The near-optimal placement of environmental sensors is found using a strategy that minimizes the variance of the spatial analysis based on randomly chosen points representing the sensor locations.
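The variance-minimization placement strategy described above can be sketched in a few lines. This is a minimal illustration, assuming an exponential variogram model whose sill and range values are arbitrary (the abstract does not specify the model); candidate sensor layouts are scored by the mean ordinary-kriging variance over the grid:

```python
import numpy as np

def variogram(h, sill=1.0, corr_range=30.0):
    # Exponential variogram model; the sill and range values are
    # illustrative assumptions, not taken from the study.
    return sill * (1.0 - np.exp(-h / corr_range))

def kriging_variance(sensors, point):
    # Ordinary point kriging variance at an unmonitored location.
    n = len(sensors)
    d = np.linalg.norm(sensors[:, None, :] - sensors[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(sensors - point, axis=-1))
    sol = np.linalg.solve(A, b)   # kriging weights lambda_i plus multiplier mu
    return float(sol @ b)         # sum(lambda_i * gamma_i) + mu

def best_layout(grid, n_sensors=5, trials=200, seed=0):
    # Monte Carlo search: score random sensor layouts by the mean kriging
    # variance over the whole grid and keep the minimum-variance layout.
    rng = np.random.default_rng(seed)
    best, best_var = None, np.inf
    for _ in range(trials):
        sensors = grid[rng.choice(len(grid), size=n_sensors, replace=False)]
        mean_var = np.mean([kriging_variance(sensors, p) for p in grid])
        if mean_var < best_var:
            best, best_var = sensors, mean_var
    return best, best_var
```

A lower mean kriging variance at unmonitored points indicates a more informative layout; the Monte Carlo search over random layouts mirrors the random-point strategy described in the abstract.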
Spatial analysis is employed using geostatistical analysis, and optimization is performed with Monte Carlo analysis. Visual sensor placement is accomplished for omnidirectional cameras operating in a WSN using an optimal placement metric (OPM), calculated for each grid point based on line-of-sight (LOS) in a defined number of directions, taking known obstacles into consideration. Optimal areas for camera placement are those generating the largest OPMs. Statistical behaviour is examined using Monte Carlo analysis with varying numbers of obstacles and cameras in a defined space.
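The OPM computation can be sketched on a discrete grid. A hedged illustration, assuming the metric equals the total unobstructed LOS distance summed over a fixed set of ray directions (the exact metric definition is not given in the abstract); grid size, direction count and obstacle sets are illustrative:

```python
import math

def opm(grid_w, grid_h, obstacles, n_dirs=8):
    """Optimal placement metric for each free grid point: the total
    unobstructed line-of-sight distance summed over n_dirs ray directions."""
    scores = {}
    for x in range(grid_w):
        for y in range(grid_h):
            if (x, y) in obstacles:
                continue  # cameras cannot occupy obstacle cells
            total = 0
            for k in range(n_dirs):
                ang = 2.0 * math.pi * k / n_dirs
                dx, dy = math.cos(ang), math.sin(ang)
                step = 1
                while True:
                    px = round(x + dx * step)
                    py = round(y + dy * step)
                    if not (0 <= px < grid_w and 0 <= py < grid_h) or (px, py) in obstacles:
                        break
                    step += 1
                total += step - 1  # cells visible before the ray was blocked
            scores[(x, y)] = total
    return scores
```

The best candidate locations are those maximizing the score, e.g. `max(scores, key=scores.get)`; repeating the computation over randomly generated obstacle and camera configurations yields the Monte Carlo statistics described in the abstract.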
Resumo:
This paper defines the characteristics of the cyclical political polarization of the Spanish media system. Building on this study, a prospective analysis raises doubts about whether this scenario will remain unchanged in the face of the political and economic crisis. It seeks to define the role played by political and media actors in polarization, focusing on the two legislatures in which tension reached its highest levels (1993-1996 and 2004-2008), and compares them with the developments those actors face in the current economic and political context of crisis. To achieve these aims, an analysis of media content (since 1993) was performed, drawing on primary sociological sources and the scientific literature on polarization. This is an exploratory, critical and descriptive case analysis.
Resumo:
Advertising investment and audience figures indicate that television continues to lead as a mass advertising medium. However, its effectiveness is questioned due to problems such as zapping, saturation and audience fragmentation, which has favoured the development of non-conventional advertising formats. This study provides empirical evidence for the theoretical development in this area. It analyzes the recall generated by four non-conventional advertising formats in a real environment: short programme (branded content), television sponsorship, and internal and external telepromotion, versus the more conventional spot. The methodology integrated secondary data with primary data from computer-assisted telephone interviews (CATI) performed ad hoc on a sample of 2,000 individuals, aged 16 to 65, representative of the total television audience. Our findings show that the non-conventional advertising formats are more effective at a cognitive level: all the analyzed formats generate higher levels of both unaided and aided recall when compared to the spot.
Resumo:
In this paper an architecture for a short-term wind farm power estimator is proposed. The estimator is made up of a Linear Machine classifier and a set of k Multilayer Perceptrons, each one trained for a specific subspace of the input space. The splitting of the input dataset into the k clusters is done using a k-means technique, with the equivalent Linear Machine classifier obtained from the cluster centroids...
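The clustering and classifier-derivation steps can be sketched as follows. A minimal illustration, assuming plain Lloyd's k-means and the standard minimum-distance Linear Machine, whose linear discriminants are obtained directly from the cluster centroids (parameter choices are illustrative; the per-cluster Multilayer Perceptrons are omitted):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Plain Lloyd's algorithm: split the input dataset into k clusters.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

def linear_machine(x, centroids):
    # Linear Machine classifier derived from the centroids:
    # g_j(x) = c_j . x - ||c_j||^2 / 2, routed to the arg-max discriminant.
    # The arg-max is equivalent to nearest-centroid assignment.
    g = centroids @ x - 0.5 * (centroids ** 2).sum(axis=1)
    return int(np.argmax(g))
```

At prediction time, `linear_machine` routes a new input to its subspace, whose dedicated Multilayer Perceptron would then produce the power estimate.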
Resumo:
This dissertation investigates the acquisition of oblique relative clauses in L2 Spanish by English and Moroccan Arabic speakers in order to understand the role of previous linguistic knowledge and its interaction with Universal Grammar, on the one hand, and the relationship between grammatical knowledge and its use in real time, on the other. Three types of tasks were employed: an oral production task, an on-line self-paced grammaticality judgment task, and an on-line self-paced reading comprehension task. Results indicated that the acquisition of oblique relative clauses in Spanish is a problematic area for second language learners of intermediate proficiency, regardless of their native language. In particular, this study has shown that, even when the learners' native language shares the main properties of the L2, i.e., fronting of the obligatory preposition (Pied-Piping), there is still room for divergence, especially in production and timed grammatical intuitions. On the other hand, reaction time data have shown that L2 learners can and do converge at the level of sentence processing, showing exactly the same real-time effects for oblique relative clauses as native speakers. Processing results demonstrated that native and non-native speakers alike are able to apply universal processing principles such as the Minimal Chain Principle (De Vincenzi, 1991) even when the L2 learners still have incomplete grammatical representations, a result that contradicts some of the predictions of the Shallow Structure Hypothesis (Clahsen & Felser, 2006). Results further suggest that the L2 processing and comprehension domains may be able to access some type of information that is not yet available to other grammatical modules, probably because transfer of certain L1 properties occurs asymmetrically across linguistic domains.
In addition, this study explored the Null-Prep phenomenon in L2 Spanish and proposed that Null-Prep is an interlanguage stage, fully available and accounted for within UG, which intermediate L2 learners, as well as first language learners, go through in the development of pied-piping oblique relative clauses. It is hypothesized that this intermediate stage results from optionality of the obligatory preposition in the derivation when it is not crucial for the meaning of the sentence and when the DP is going to be in an A-bar position, so it can get default case. This optionality can be predicted by the Bottleneck Hypothesis (Slabakova, 2009c) if we consider these prepositions to be a type of functional morphology. This study contributes to the field of SLA and L2 processing in various ways. First, it demonstrates that grammatical representations may be dissociated from grammatical processing, in the sense that L2 learners, unlike native speakers, can present unexpected asymmetries such as convergent processing but divergent grammatical intuitions or production. This conclusion is only possible under the assumption of a modular language system. Finally, it contributes to the general debate in generative SLA, since it argues for a fully UG-constrained interlanguage grammar.
Resumo:
Mercury is not an essential element for plant or animal life, and it is a potential environmental toxicant because of its tendency to form covalent bonds with organic molecules and the high stability of the Hg-C bond. Reports estimate total mercury concentrations in natural waters ranging from 0.2 to 100 ng L-1, so highly sensitive methods are required for the direct determination of such extremely low levels. In this work, a rapid and simple method was developed for the separation and preconcentration of mercury by flow injection solid phase extraction coupled with on-line chemical vapour generation electrothermal atomic absorption spectrometry. The system is based on chelating retention of the analyte on a mini column filled with a mesoporous silica functionalized with 1,5-bis(di-2-pyridyl)methylene thiocarbohydrazide. The main aim of this work was to develop a precise and accurate method for the determination of Hg. Under the optimal conditions and a 120 s preconcentration time, the detection limit obtained was 0.009 μg L-1, with RSDs of 3.7% for 0.2 μg L-1 and 4.8% for 1 μg L-1, and an enrichment factor of 4. Furthermore, the proposed method permitted the determination of Hg with a reduced analysis time (a sample throughput of about 18 h-1) and low consumption of reagents and sample volume. The method was applied to the determination of Hg in sea water and river water. For quality control of the analytical performance and validation of the newly developed method, two certified samples, TMDA 54.4 Fortified Lake water and LGC6187 River Sediment, were analysed. The results showed good agreement with the certified values.
Resumo:
Electrical properties of polycrystalline gas sensors are analyzed by d.c. and a.c. measurements. Comparing d.c. electrical conductivity values with those obtained by admittance spectroscopy methods helps to obtain a detailed 'on line' analysis of conductivity-modulated gas sensors. The electrical behaviour of grain boundaries is obtained, and new sensor designs can be achieved by enhancing the activity of surface states in the detecting operation. A Schottky barrier model is used to explain the grain boundary action in the presence of surrounding gases. The height of this barrier is a function of gas concentration, due to the trapping at the interface of excess charge generated by gas adsorption. A comparison between this dependence and a plot of the real and imaginary components of the admittance versus frequency at different gas concentrations provides information on the parameters that play a role in the conduction mechanisms. These methods have been applied to the design of a CO sensor for domestic purposes based on tin oxide films, the characteristics of which are presented.
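The barrier-controlled conduction described above can be expressed with a short numeric sketch. This assumes the standard thermally activated form G = G0 * exp(-q*Phi_B / kT) together with a purely illustrative logarithmic barrier-lowering law for a reducing gas such as CO; neither the parameter values nor the concentration law come from the paper:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def conductance(barrier_ev, temp_k, g0=1.0):
    # Thermally activated transport over the grain-boundary Schottky
    # barrier: G = G0 * exp(-q*Phi_B / kT), with the barrier in eV.
    return g0 * math.exp(-barrier_ev / (K_B_EV * temp_k))

def barrier_height(phi0_ev, gas_conc_ppm, alpha=0.05):
    # Illustrative barrier-lowering law: adsorption of a reducing gas
    # (e.g. CO) releases trapped interface charge and lowers the barrier.
    # The logarithmic form and alpha are assumptions, not from the paper.
    return max(phi0_ev - alpha * math.log1p(gas_conc_ppm), 0.0)
```

For a tin oxide film at a typical operating temperature of a few hundred degrees Celsius, raising the CO concentration lowers the barrier and raises the conductance, reproducing the qualitative dependence the sensor design exploits.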
Resumo:
This study presents an exploratory analysis of the correlation between institutional strength, peace conditions, and entrepreneurship in a sample of 23 departments of Colombia, using data from 2014. To this end, three indices were proposed and constructed following seminal conceptual definitions or international evaluation standards, namely: 1) the Institutional Strength Index, 2) the Peacebuilding Index (built from a negative-peace index and a positive-peace index) and 3) the Productive Entrepreneurship Index. The results do not show a significant correlation among all three indices. On the one hand, there is a significant correlation (p<0.05) between the institutional strength and productive entrepreneurship indices. On the other hand, there are non-significant negative correlations between positive peace and institutional strength, between productive entrepreneurship and positive peace, and between productive entrepreneurship and peacebuilding. In a second approach, departmental population was the variable with the largest number of significant correlations (p<0.01) with variables related to productive entrepreneurship, employment, gross domestic product, industrial sophistication, innovation (patents) and crime. Finally, conclusions and future research are discussed.
Resumo:
Principal Topic: It is well known that most new ventures suffer from a significant lack of resources, which increases the risk of failure (Shepherd, Douglas and Shanley, 2000) and makes it difficult to attract stakeholders and financing for the venture (Bhide & Stevenson, 1999). The Resource-Based View (RBV) (Barney, 1991; Wernerfelt, 1984) is a dominant theoretical base increasingly drawn on within Strategic Management. While theoretical contributions applying RBV in the domain of entrepreneurship can arguably be traced back to Penrose (1959), there has been renewed attention recently (e.g. Alvarez & Busenitz, 2001; Alvarez & Barney, 2004). That said, empirical work is in its infancy, perhaps in part due to a lack of well-developed measurement instruments for testing ideas derived from RBV. The purpose of this study is to develop measurement scales that can serve to assist such empirical investigations. In so doing we try to overcome three deficiencies in the empirical measures currently used to apply RBV to the entrepreneurship arena. First, measures need to be developed for the resource characteristics and configurations associated with the competitive advantages typically found in entrepreneurial firms. These include such things as alertness and industry knowledge (Kirzner, 1973), flexibility (Ebben & Johnson, 2005), strong networks (Lee et al., 2001) and, within knowledge-intensive contexts, unique technical expertise (Wiklund and Shepherd, 2003). Second, the RBV has the important limitations of being relatively static and modelled on large, established firms. In that context, traditional RBV focuses on competitive advantages. However, newly established firms often face disadvantages, especially those associated with the liabilities of newness (Aldrich & Auster, 1986). It is therefore important in entrepreneurial contexts to expand the investigation to responses to competitive disadvantage through an RBV lens.
Conversely, recent research has suggested that resource constraints can actually have a positive effect on firm growth and performance under some circumstances (e.g., George, 2005; Katila & Shane, 2005; Mishina et al., 2004; Mosakowski, 2002; cf. also Baker & Nelson, 2005). Third, current empirical applications of RBV measure the levels or amounts of particular resources available to a firm. They infer that these resources deliver competitive advantage by establishing a relationship between resource levels and performance (e.g. via regression on profitability). However, there is an opportunity to directly measure the characteristics of the resource configurations that deliver competitive advantage, such as Barney's well-known VRIO (Valuable, Rare, Inimitable and Organized) framework (Barney, 1997). Key Propositions and Methods: The aim of our study is to develop and test scales for measuring resource advantages (and disadvantages) and inimitability for entrepreneurial firms. The study proceeds in three stages. The first stage developed our initial scales based on earlier literature, adapting scales from previous work where possible. The first block of scales relates to the level of resource advantages and disadvantages. Respondents were asked the degree to which each resource category represented an advantage or disadvantage relative to other businesses in their industry on a 5-point response scale: Major Disadvantage, Slight Disadvantage, No Advantage or Disadvantage, Slight Advantage and Major Advantage. Items were developed as follows. Network capabilities (3 items) were adapted from Madsen, Alsos, Borch, Ljunggren and Brastad (2006). Knowledge resources for marketing expertise / customer service (3 items) and technical expertise (3 items) were adapted from Wiklund and Shepherd (2003). Flexibility (2 items) and costs (4 items) were adapted from JIBS B97. New scales were developed for industry knowledge / alertness (3 items) and product / service advantages.
The second block asked the respondent to nominate the most important resource advantage (and disadvantage) of the firm. For the advantage, they were then asked four questions, on a 5-point Likert scale, to determine how easy it would be for other firms to imitate and/or substitute this resource. For the disadvantage, they were asked corresponding questions about overcoming it. The second stage involved two pre-tests of the instrument to refine the scales. The first was an on-line convenience sample of 38 respondents. The second pre-test was a telephone interview with a random sample of 31 Nascent firms and 47 Young firms (< 3 years in operation) generated using a PSED method of randomly calling households (Gartner et al. 2004). Several items were dropped or reworded based on the pre-tests. The third stage (currently in progress) is part of Wave 1 of CAUSEE (Nascent firms) and FEDP (Young firms), a PSED-type study being conducted in Australia. The scales will be tested and analysed with random samples of approximately 700 Nascent and Young firms respectively. In addition, a judgement sample of approximately 100 high-potential businesses in each category will be included. Findings and Implications: The main study (stage 3; data collection is currently in progress) will allow comparison of the level of resource advantage / disadvantage across various sub-groups of the population. Of particular interest will be a comparison of the high-potential firms with the random sample. Based on the smaller pre-tests (N=38 and N=78), the factor structure of the items confirmed the distinctiveness of the constructs, and the reliabilities are within an acceptable range: Cronbach alpha ranged from 0.701 to 0.927. The study will provide an opportunity for researchers to better operationalize RBV theory in studies within the domain of entrepreneurship.
This is a fundamental requirement for the ability to test hypotheses derived from RBV in systematic, large scale research studies.
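The reliability figures reported above (Cronbach alpha between 0.701 and 0.927) follow the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total scores). A minimal sketch of that computation; the data layout is an assumption for illustration, not from the paper:

```python
import numpy as np

def cronbach_alpha(items):
    # items: (n_respondents, n_items) matrix of scale responses,
    # e.g. the 5-point advantage/disadvantage ratings for one construct.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)
```

The statistic would be computed separately for each construct's item block (e.g. the 3 network-capability items), with values above roughly 0.7 conventionally taken as acceptable internal consistency.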
Resumo:
This is the lead article for an issue of M/C Journal on the theme ‘obsolete.’ It uses the history of the International Journal of Cultural Studies (of which the author has been editor since 1997) to investigate technological innovations and their scholarly implications in academic journal publishing, in particular the obsolescence of the print form. Print-based elements such as cover design, the running order of articles, special issues, refereeing and the reading experience are all rendered obsolete by the growth of online access to individual articles. The paper argues that this individuation of reading choices may be accompanied by less welcome tendencies, such as a decline in collegiality, disciplinary innovation, and trust.