839 results for Context-aware computing and systems
Abstract:
The circumstances in Colombo, Sri Lanka, and in Belfast, Northern Ireland, which led to a) the generalization of luminescent PET (photoinduced electron transfer) sensing/switching as a design tool, b) the construction of a market-leading blood electrolyte analyzer and c) the invention of molecular logic-based computation as an experimental field, are delineated. Efforts to extend the philosophy of these approaches into issues of small object identification, nanometric mapping, animal visual perception and visual art are also outlined.
Abstract:
Most contemporary explanations of congressional leadership postulate a version of contextual theory that typically places greatest emphasis on the strength of party and downplays the personal skills of individual leaders. By analyzing the leadership of just three recent individuals—Gingrich, Hastert, and Lott—this essay demonstrates the extent to which these leaders' different styles, skills, and characteristics interacted with changing political contexts and strategic environments to impact political and policy outcomes. Context matters, but so does leadership skill. Most graphically, Gingrich—a rare transforming leader in Burns' typology—demonstrates the importance of the right person and the right conditions being in place at the same time and the ability of an individual imaginative leader to intervene exogenously to have a significant effect on policy outcomes. Yet the essay also demonstrates that even where leaders adopt more conventional transactional styles, as Hastert and Lott did, the skill and success with which they juggle political pressures emanating from different, often conflicting, contexts—skills in context—also matters.
Abstract:
In this work a mixed integer linear programming (MILP) model was applied to mixed line rate (MLR) IP over WDM and IP over OTN over WDM (with and without OTN grooming) networks, with the aim of reducing network energy consumption. Energy-aware and energy-aware & short-path routing techniques were used. Simulations were based on a real network topology and on traffic-matrix forecasts derived from statistical data from 2005 up to 2017. The energy-aware routing optimization model on the IP over WDM network showed the lowest energy consumption across all years; compared with energy-aware & short-path routing, it achieved an overall reduction in energy consumption of up to 29%, with even greater savings expected relative to shortest-path routing. © 2014 IEEE.
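The core contrast between shortest-path and energy-aware routing can be sketched by running the same least-cost search twice, once weighting each link by hop count and once by an assumed per-link energy cost. The topology and energy figures below are invented for illustration; they are not the paper's MILP formulation, which optimizes the whole network jointly:

```python
import heapq

def min_cost_path(graph, src, dst, weight):
    # Dijkstra over a directed graph: {node: {neighbor: {attr: cost}}}.
    # 'weight' selects which link attribute to minimize.
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            break
        for v, attrs in graph[u].items():
            nd = d + attrs[weight]
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

# Hypothetical 3-node topology: the direct link A->D is one hop but
# energy-hungry; the detour via C costs an extra hop but far less energy.
graph = {
    "A": {"D": {"hops": 1, "energy_w": 50.0},
          "C": {"hops": 1, "energy_w": 3.0}},
    "C": {"D": {"hops": 1, "energy_w": 4.0}},
    "D": {},
}

hop_path, hop_cost = min_cost_path(graph, "A", "D", "hops")        # shortest-path routing
energy_path, energy_cost = min_cost_path(graph, "A", "D", "energy_w")  # energy-aware routing
```

Here shortest-path routing picks the direct link while energy-aware routing accepts the longer path, which is exactly the trade-off the abstract's 29% figure quantifies at network scale.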
Abstract:
Power laws, also known as Pareto-like laws or Zipf-like laws, are commonly used to explain a variety of distinct real-world phenomena, often described merely by the signals they produce. In this paper, we study twelve cases, namely worldwide technological accidents, the annual revenue of America's largest private companies, the number of inhabitants in America's largest cities, the magnitude of earthquakes with minimum moment magnitude equal to 4, the total burned area in forest fires in Portugal, the net worth of the richest people in America, the frequency of occurrence of words in the novel Ulysses, by James Joyce, the total number of deaths in worldwide terrorist attacks, the number of linking root domains of the top internet domains, the number of linking root domains of the top internet pages, the total number of human victims of tornadoes in the U.S., and the number of inhabitants in the 60 most populated countries. The results demonstrate the emergence of statistical characteristics very close to power law behavior. Furthermore, the parametric characterization reveals complex relationships present at a higher level of description.
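A minimal sketch of the rank-frequency analysis behind such studies: sort the observed values in descending order and regress log(value) on log(rank); the negative slope estimates the power-law exponent. The synthetic data and the plain least-squares estimator are illustrative only (least squares on log-log data is a common first pass, not the paper's full parametric characterization):

```python
import math

def fit_power_law(values):
    # Rank-frequency fit: sort descending, regress log(value) on log(rank).
    # Returns the estimated exponent alpha, assuming value ~ C / rank^alpha.
    ranked = sorted(values, reverse=True)
    xs = [math.log(rank) for rank in range(1, len(ranked) + 1)]
    ys = [math.log(v) for v in ranked]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic data drawn exactly from a power law with exponent 1.2.
data = [1000.0 / k ** 1.2 for k in range(1, 201)]
alpha = fit_power_law(data)  # recovers ~1.2
```

On real data such as city sizes or word frequencies the fit is only approximate, which is why the abstract speaks of behavior "very close to" a power law rather than an exact one.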
Abstract:
This case study of curriculum at Dubai Women's College (DWC) examines the perceptions of international educators who designed and implemented curriculum for female Emirati higher-education students in the UAE, and sheds light on the complex social, cultural, and religious factors affecting educational practice. Participants were faculty and supervisors, mainly foreign nationals, while students at DWC are exclusively Emirati. Theories prominent in this study are constructivist learning theory, transformative curriculum theory, and sociological theory; change and empowerment theory also figure prominently. Findings reveal that this unique group of educators understands curriculum theory as a "contextualized" construct and argues that theory and practice must be viewed through an international lens of religious, cultural, and social contexts. The study also explores how mandated "standards" in education (in the form of the International English Language Testing System, IELTS) and the integrated, constructivist curriculum, as taught in the Higher Diploma Year 1 program, function as dual curricular emphases in this context. The study found that tensions among these dual emphases existed and were mediated through specific strategies, including the use of authentic texts to mirror the IELTS examination during in-class activities, and the relevance of curricular tasks.
Abstract:
Adjustment is an ongoing process by which factors of production are reallocated to equalize their returns in different uses. Adjustment occurs through market mechanisms or intrafirm reallocation of resources as a result of changes in terms of trade, government policies, resource availability, technological change, etc. These changes alter production opportunities and production, transaction and information costs, and consequently modify production functions, organizational design, etc. In this paper we define adjustment (section 2); review empirical estimates of the extent of adjustment in Canada and abroad (section 3); review selected features of the trade policy and adjustment context of relevance for policy formulation, among which: slow growth, a shift to services, a shift to the Pacific Rim, the internationalization of production, investment, distribution and communications, the growing use of NTBs, changes in foreign direct investment patterns, intrafirm and intraindustry trade, interregional trade flows, and differences in microeconomic adjustment processes between subsidiaries and Canadian companies (section 4); examine methodologies and results of studies of the impact of trade liberalization on jobs (section 5); and review the R. Harris general equilibrium model (section 6). Our conclusion emphasizes the importance of harmonizing commercial and domestic policies dealing with adjustment (section 7). We close with a bibliography of relevant publications.
Abstract:
The photoacoustic investigations carried out on different photonic materials are presented in this thesis. The photonic materials selected for investigation are tape cast ceramics, multilayer dielectric coatings, organic dye doped PVA films and PMMA matrices doped with dye mixtures. The studies are performed by measuring the photoacoustic signal generated by modulated cw laser irradiation of the samples. The gas-microphone scheme is employed for detection of the photoacoustic signal. The different measurements reported here reveal the adaptability and utility of the PA technique for the characterization of photonic materials.

Ceramics find applications in the microelectronics industry. Tape cast ceramics are the building blocks of many electronic components, and certain ceramic tapes are used as thermal barriers. The thermal parameters of these tapes will not be the same as those of thin films of the same materials; the parameters are influenced by the presence of foreign bodies in the matrix and by the sample preparation technique. Measurements are done on ceramic tapes of zirconia, zirconia-alumina combination, barium titanate, barium tin titanate, silicon carbide, lead zirconate titanate (PZT) and lead magnesium niobate titanate (PMN-PT). Various configurations of the photoacoustic technique, viz. heat reflection geometry and heat transmission geometry, have been used for the evaluation of different thermal parameters of the sample. Heat reflection geometry of the PA cell has been used for the evaluation of thermal effusivity, and heat transmission geometry has been used for the evaluation of thermal diffusivity. From the thermal diffusivity and thermal effusivity values, thermal conductivity is also calculated. The calculated values are nearly the same as the values reported for the pure materials.
This shows the feasibility of the photoacoustic technique for the thermal characterization of ceramic tapes.

Organic dyes find applications as holographic recording media and as active media for laser operation. Knowledge of the photochemical stability of the material is essential if it is to be used for any of these applications. Mixing one dye with another can change the properties of the resulting system; through careful mixing of dyes in appropriate proportions and incorporating them in polymer matrices, media of the required stability can be prepared. Investigations are carried out on Rhodamine 6G-Rhodamine B mixture doped PMMA samples. Addition of RhB in small amounts is found to stabilize Rh6G against photodegradation, and addition of Rh6G to RhB increases the photosensitivity of the latter. The PA technique has been successfully employed for the monitoring of dye mixture doped PMMA samples. The same technique has been used for monitoring the photodegradation of a laser dye, cresyl violet, doped in polyvinyl alcohol.

Another important application of the photoacoustic technique is in nondestructive evaluation of layered samples. The depth profiling capability of the PA technique has been used for the non-destructive testing of multilayer dielectric films, which are highly reflecting in the wavelength range selected for the investigations. Even though calculation of the thickness of the film is not possible, the number of layers present in the system can be found using the PA technique. The phase plot has clear step-like discontinuities, the number of which coincides with the number of layers present in the multilayer stack. This shows the sensitivity of the PA signal phase to boundaries in a layered structure. This aspect of the PA signal can be utilized in non-destructive depth profiling of reflecting samples and for the identification of defects in layered structures.
Abstract:
The underlying assumptions for interpreting the meaning of data often change over time, which further complicates the problem of semantic heterogeneities among autonomous data sources. As an extension to the COntext INterchange (COIN) framework, this paper introduces the notion of temporal context as a formalization of the problem. We represent temporal context as a multi-valued method in F-Logic; however, only one value is valid at any point in time, the determination of which is constrained by temporal relations. This representation is then mapped to an abductive constraint logic programming framework with temporal relations being treated as constraints. A mediation engine that implements the framework automatically detects and reconciles semantic differences at different times. We articulate that this extended COIN framework is suitable for reasoning on the Semantic Web.
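A toy illustration of the central idea, that a temporal context is multi-valued but has exactly one valid value at any point in time. This is not the COIN/F-Logic machinery itself; the "reporting unit" attribute, the interval table, and the dates are invented for the example:

```python
from datetime import date

# Hypothetical temporal context for one data source: the unit in which
# a revenue figure is reported changed over time. Intervals must not
# overlap, so exactly one value is valid at any given date.
temporal_context = [
    # (valid_from, valid_to, value)
    (date(2000, 1, 1), date(2004, 12, 31), "thousands"),
    (date(2005, 1, 1), date(9999, 12, 31), "millions"),
]

def context_at(ctx, when):
    # Resolve the single context value valid at time 'when'.
    for start, end, value in ctx:
        if start <= when <= end:
            return value
    raise LookupError(f"no context value valid at {when}")
```

A mediation engine in the spirit of the paper would consult such a table for both the source and the receiver at the relevant times, and insert a conversion whenever the two resolved values differ.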
Abstract:
Memory errors are a common cause of incorrect software execution and security vulnerabilities. We have developed two new techniques that help software continue to execute successfully through memory errors: failure-oblivious computing and boundless memory blocks. The foundation of both techniques is a compiler that generates code that checks accesses via pointers to detect out-of-bounds accesses. Instead of terminating or throwing an exception, the generated code takes another action that keeps the program executing without memory corruption. Failure-oblivious code simply discards invalid writes and manufactures values to return for invalid reads, enabling the program to continue its normal execution path. Code that implements boundless memory blocks stores invalid writes away in a hash table to return as the values for corresponding out-of-bounds reads. The net effect is to (conceptually) give each allocated memory block unbounded size and to eliminate out-of-bounds accesses as a programming error. We have implemented both techniques and applied them to several widely used open source servers (Apache, Sendmail, Pine, Mutt, and Midnight Commander). With standard compilers, all of these servers are vulnerable to buffer overflow attacks, as documented at security tracking web sites. Both failure-oblivious computing and boundless memory blocks eliminate these security vulnerabilities (as well as other memory errors). Our results show that our compiler enables the servers to execute successfully through buffer overflow attacks and to continue to correctly service user requests without security vulnerabilities.
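The boundless-memory-block behavior can be sketched in a few lines (the real system is a C compiler transformation, not a library class): in-bounds accesses hit the backing store, an out-of-bounds write is stashed in a hash table instead of corrupting adjacent memory, and an out-of-bounds read returns the stashed value if one exists, or a manufactured default otherwise:

```python
class BoundlessBlock:
    # Sketch of a boundless memory block of a fixed nominal size.
    def __init__(self, size, fill=0):
        self._data = [fill] * size   # the real allocated storage
        self._overflow = {}          # hash table: index -> out-of-bounds write

    def write(self, index, value):
        if 0 <= index < len(self._data):
            self._data[index] = value
        else:
            # Out-of-bounds write: stash it rather than corrupt a neighbor.
            self._overflow[index] = value

    def read(self, index):
        if 0 <= index < len(self._data):
            return self._data[index]
        # Out-of-bounds read: return the matching stashed write if any,
        # else manufacture a default value so execution can continue.
        return self._overflow.get(index, 0)
```

Conceptually the block now has unbounded size: a buffer-overflow write lands harmlessly in the hash table, which is why the technique turns an exploitable overflow into a recoverable event.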
Abstract:
Network connectivity is reaching more and more into the physical world. This is potentially transformative, allowing every object and service in the world to talk to one another, and to their users, through any networked interface: a world where online services are the connective tissue of the physical world and where physical objects are avatars of online services.