963 results for dynamic digital displays


Relevance:

30.00%

Publisher:

Abstract:

We use the eclectic paradigm as an analytical framework to explain the activities of MNE e-commerce companies in China. Grounded in rich data, we argue that the dynamic interplay between ownership advantages and the local institutional context that has emerged, particularly in the information age, plays a significant role in explaining the trajectory of MNE e-commerce companies in China. We propose On, Ln, and In by embedding network-based advantages within the OLI paradigm. With the acceleration of technological change and non-ergodic uncertainty, such a network-embedded eclectic paradigm will lead to MNE e-commerce companies' sustainable development in emerging economies.

Relevance:

30.00%

Publisher:

Abstract:

The highly dynamic nature of some sandy shores, with continuous morphological changes, requires the development of efficient and accurate methodological strategies for coastal hazard assessment and morphodynamic characterisation. During the past decades, the general methodological approach for establishing coastal monitoring programmes was based on photogrammetry or classical geodetic techniques. With the advent of new space-based and airborne geodetic techniques, new methodologies were introduced into coastal monitoring programmes. This paper describes the development of a monitoring prototype based on the global positioning system (GPS). The prototype has a GPS multi-antenna mounted on a fast surveying platform, a land vehicle appropriate for driving on sand (a four-wheel quad). This system was conceived to perform a network of shore profiles along sandy shore stretches (subaerial beach) that extend for several kilometres, from which high-precision digital elevation models can be generated. An analysis of the accuracy and precision of some differential GPS kinematic methodologies is presented. The development of an adequate survey methodology is the first step in morphodynamic shore characterisation or in coastal hazard assessment. The sampling method and the computational interpolation procedures are important steps for producing reliable three-dimensional surface maps that are as close to reality as possible. The quality of several interpolation methods used to generate grids was tested in areas with data gaps. The results obtained allow us to conclude that, with the developed survey methodology, it is possible to survey sandy shore stretches, at spatial scales of kilometres, with a vertical accuracy better than 0.10 m in the final digital elevation models.
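
The abstract does not name the interpolation methods that were compared; purely as an illustrative sketch, inverse-distance weighting is one common choice for filling gaps when gridding GPS shore profiles into a digital elevation model (the function and the sample points below are hypothetical, not taken from the paper):

```python
import math

def idw_interpolate(known, qx, qy, power=2):
    """Inverse-distance-weighted elevation estimate at (qx, qy) from
    known (x, y, z) samples; a common gap-filling interpolator for DEMs."""
    num, den = 0.0, 0.0
    for x, y, z in known:
        d = math.hypot(qx - x, qy - y)
        if d == 0.0:
            return z  # query point coincides with a surveyed sample
        w = 1.0 / d ** power
        num += w * z
        den += w
    return num / den

# Four surveyed beach-profile points surrounding a data gap (elevations in metres)
samples = [(0, 0, 1.0), (1, 0, 1.2), (0, 1, 0.8), (1, 1, 1.0)]
print(idw_interpolate(samples, 0.5, 0.5))  # -> 1.0 (mean of the four, by symmetry)
```

Other methods tested in such studies (kriging, splines, triangulation) weight neighbours differently; the choice matters most exactly where the survey has gaps.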

Relevance:

30.00%

Publisher:

Abstract:

Deployment of low-power basestations within cellular networks can potentially increase both capacity and coverage. However, such deployments require efficient resource allocation schemes for managing interference from the low-power and macro basestations located within each other's transmission range. In this dissertation, we propose novel and efficient dynamic resource allocation algorithms in the frequency, time, and space domains. We show that the proposed algorithms perform better than current state-of-the-art resource management algorithms. In the first part of the dissertation, we propose an interference management solution in the frequency domain. We introduce a distributed frequency allocation scheme that shares frequencies between macro and low-power pico basestations and guarantees a minimum average throughput to users. The scheme seeks to minimize the total number of frequencies needed to honor the minimum throughput requirements. We evaluate our scheme using detailed simulations and show that it performs on par with the centralized optimum allocation. Moreover, our proposed scheme outperforms both a static frequency reuse scheme and the centralized optimal partitioning between the macro and pico basestations. In the second part of the dissertation, we propose a time-domain solution to the interference problem. We consider the problem of maximizing the alpha-fairness utility over heterogeneous wireless networks (HetNets) by jointly optimizing user association, wherein each user is associated to one transmission point (TP) in the network, and the activation fractions of all TPs. The activation fraction of a TP is the fraction of the frame duration for which it is active, and together these fractions influence the interference seen in the network. To address this joint optimization problem, which we show is NP-hard, we propose an alternating-optimization approach wherein the activation fractions and the user association are optimized in an alternating manner. The subproblem of determining the optimal activation fractions is solved using a provably convergent auxiliary function method, while the subproblem of determining the user association is solved via a simple combinatorial algorithm. Meaningful performance guarantees are derived in both cases. Simulation results over a practical HetNet topology reveal the superior performance of the proposed algorithms and underscore the significant benefits of the joint optimization. In the final part of the dissertation, we propose a space-domain solution to the interference problem. We consider the problem of maximizing system utility by optimizing over the set of user and TP pairs in each subframe, where each user can be served by multiple TPs. To address this optimization problem, which is NP-hard, we propose a solution scheme based on difference-of-submodular-functions optimization. We evaluate our scheme using detailed simulations and show that it performs on par with a much more computationally demanding difference-of-convex-functions optimization scheme. Moreover, the proposed scheme performs within a reasonable percentage of the optimal solution. We further demonstrate the advantage of the proposed scheme by studying its performance under variation of different network topology parameters.
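
The dissertation's alternating optimization is specific to its HetNet model (activation fractions and user association), but the alternating pattern itself can be sketched generically. The toy objective and closed-form coordinate minimizers below are assumptions for illustration only:

```python
def alternate_minimize(min_x, min_y, x0, y0, iters=50):
    """Alternating optimization: hold y fixed and minimize over x,
    then hold x fixed and minimize over y; repeat toward a fixed point."""
    x, y = x0, y0
    for _ in range(iters):
        x = min_x(y)   # subproblem 1: best x for the current y
        y = min_y(x)   # subproblem 2: best y for the current x
    return x, y

# Toy objective f(x, y) = (x - 1)**2 + (y - 2)**2 + 0.5 * x * y, whose
# coordinate-wise minimizers have closed forms: x* = 1 - y/4, y* = 2 - x/4.
x, y = alternate_minimize(lambda y: 1 - y / 4, lambda x: 2 - x / 4, 0.0, 0.0)
# Converges to the joint minimizer (8/15, 28/15): each round is a contraction.
```

In the dissertation, the two subproblems are far harder (an auxiliary-function method and a combinatorial algorithm), but the outer loop has this same shape.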

Relevance:

30.00%

Publisher:

Abstract:

In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly, and so on. There is a tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real-time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods. 
I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
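
As a minimal illustration only, not the dissertation's aggregation overlay design, a neighborhood-driven aggregate can be maintained incrementally by pushing each node's update into the running aggregates of its 1-hop neighborhood. The class, event names, and count aggregate below are hypothetical:

```python
from collections import defaultdict

class NeighborhoodAggregator:
    """Sketch of continuous neighborhood-driven aggregation: each node keeps
    a running count of updates posted in its 1-hop neighborhood (incl. itself)."""
    def __init__(self):
        self.neighbors = defaultdict(set)
        self.agg = defaultdict(int)  # per-node partial aggregate

    def add_edge(self, u, v):
        self.neighbors[u].add(v)
        self.neighbors[v].add(u)

    def post_update(self, node):
        # Push the event into the aggregates of the node and its neighbors,
        # so neighborhood queries read a precomputed value at low latency.
        self.agg[node] += 1
        for n in self.neighbors[node]:
            self.agg[n] += 1

g = NeighborhoodAggregator()
g.add_edge("a", "b"); g.add_edge("b", "c")
g.post_update("b")   # seen by a, b, c
g.post_update("a")   # seen by a, b
print(g.agg["b"])    # -> 2
```

The overlay described in the dissertation goes further by sharing partial aggregates across queries; this sketch shows only the push-on-update idea.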

Relevance:

30.00%

Publisher:

Abstract:

A simple but efficient voice activity detector based on the Hilbert transform and a dynamic threshold is presented for use in the pre-processing of audio signals. The algorithm that defines the dynamic threshold is a modification of a convex combination found in the literature. This scheme allows the detection of prosodic and silence segments in speech under non-ideal conditions, such as spectrally overlapped noise. The present work shows preliminary results on a database built from political speeches. The tests were performed by adding artificial noise, as well as natural noise, to the audio signals, and several algorithms are compared. The results will be extrapolated to adaptive filtering of monophonic signals and the analysis of speech pathologies in future work.
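
The paper's exact threshold rule (a modified convex combination) is not reproduced here; the sketch below only illustrates the general idea, an FFT-based Hilbert envelope with a simple dynamic threshold tied to an estimated noise floor. The frame size, threshold factor, and test signal are assumptions:

```python
import numpy as np

def hilbert_envelope(x):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.abs(np.fft.ifft(X * h))

def vad(x, frame=160, alpha=1.5):
    """Flag frames whose mean envelope exceeds a dynamic threshold
    (a crude stand-in for the paper's convex-combination rule)."""
    env = hilbert_envelope(x)
    frames = [env[i:i + frame].mean() for i in range(0, len(env) - frame + 1, frame)]
    noise = min(frames)  # simple noise-floor estimate
    return [f > alpha * noise for f in frames]

fs = 8000
t = np.arange(fs) / fs
sig = np.where(t < 0.5, 0.01, 1.0) * np.sin(2 * np.pi * 440 * t)  # quiet, then loud
decisions = vad(sig)  # early frames flagged inactive, later frames active
```

A real detector would smooth the threshold over time rather than use a global minimum, which is exactly where a convex combination of past and current estimates comes in.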

Relevance:

30.00%

Publisher:

Abstract:

This dissertation focuses on design challenges caused by secondary impacts to printed wiring assemblies (PWAs) within hand-held electronics under accidental drop or impact loading. The continuing increase in functionality, miniaturization, and affordability has resulted in a decrease in the size and weight of handheld electronic products. As a result, PWAs have become thinner and the clearances between surrounding structures have decreased. The resulting increase in the flexibility of PWAs, in combination with the reduced clearances, requires new design rules to minimize and survive possible internal collisions between PWAs and surrounding structures. Such collisions are termed 'secondary impact' in this study. The effect of secondary impact on the board-level drop reliability of printed wiring boards (PWBs) assembled with MEMS microphone components is investigated using a combination of testing, response and stress analysis, and damage modeling. The response analysis is conducted using a combination of numerical finite element modeling and simplified analytic models for additional parametric sensitivity studies.

Relevance:

30.00%

Publisher:

Abstract:

Problems in subject access to information organization systems have been under investigation for a long time. Focusing on item-level information discovery and access, researchers have identified a range of subject access problems, including the quality and application of metadata, as well as the complexity of user knowledge required for successful subject exploration. While aggregations of digital collections built in the United States and abroad generate collection-level metadata of varying granularity and richness, no research has yet focused on the role of collection-level metadata in user interaction with these aggregations. This dissertation research sought to bridge this gap by answering the question "How does collection-level metadata mediate scholarly subject access to aggregated digital collections?" This goal was achieved using three research methods:
• in-depth comparative content analysis of collection-level metadata in three large-scale aggregations of cultural heritage digital collections: Opening History, American Memory, and The European Library;
• transaction log analysis of user interactions with Opening History; and
• interview and observation data on academic historians interacting with two aggregations: Opening History and American Memory.
It was found that subject-based resource discovery is significantly influenced by collection-level metadata richness. This richness includes such components as: 1) describing a collection's subject matter with mutually complementary values in different metadata fields, and 2) a variety of collection properties/characteristics encoded in the free-text Description field; types and genres of objects in a digital collection, as well as topical, geographic, and temporal coverage, are the most consistently represented collection characteristics in free-text Description fields. Analysis of user interactions with aggregations of digital collections yields a number of interesting findings. Item-level user interactions were found to occur more often than collection-level interactions. Collection browse is initiated more often than search, and subject browse (topical and geographic) is used most often. The majority of collection search queries fall within FRBR Group 3 categories: object, concept, and place. Significantly more object, concept, and corporate-body searches, and fewer individual-person, event, and class-of-persons searches, were observed in collection searches than in item searches. While collection search is most often satisfied by the Description and/or Subjects collection metadata fields, a significant proportion of collection records would not be retrieved without controlled-vocabulary subject metadata (Temporal Coverage, Geographic Coverage, Subjects, and Objects) and free-text metadata (the Description field). Observation data show that collection metadata records in the Opening History and American Memory aggregations are often viewed. Transaction log data show a high level of engagement with collection metadata records in Opening History, with total page views for collections more than 4 times greater than item page views. Scholars observed viewing collection records valued descriptive information on provenance, collection size, types of objects, subjects, geographic coverage, and temporal coverage. They also considered the structured display of collection metadata in Opening History more useful than the alternative approach taken by other aggregations, such as American Memory, which displays only the free-text Description field to the end user. The results extend our understanding of the value of collection-level subject metadata, particularly free-text metadata, for scholarly users of aggregations of digital collections. The analysis of the collection metadata created by three large-scale aggregations provides a better understanding of collection-level metadata application patterns and suggests best practices. This dissertation is also the first empirical research contribution to test the FRBR model as a conceptual and analytic framework for studying collection-level subject access.

Relevance:

30.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause from the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing integrated development environment (IDE). The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source-code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm's running time and the output subsequence length.
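
The dissertation's efficient multi-trace algorithm is not reproduced here; as a toy sketch of the underlying idea, a dynamic-programming longest common subsequence of two failing traces keeps only the code locations the failures share, in order. The trace contents below are hypothetical:

```python
def lcs(a, b):
    """Longest common subsequence of two execution traces (lists of code
    locations). Subsequences shared across failing tests narrow the search
    space for the faulty execution path."""
    m, n = len(a), len(b)
    dp = [[[] for _ in range(n + 1)] for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + [a[i]]
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j], key=len)
    return dp[m][n]

# Two failing test traces over hypothetical code-location identifiers
trace1 = ["init", "parse", "lookup", "write", "close"]
trace2 = ["init", "lookup", "validate", "write"]
print(lcs(trace1, trace2))  # -> ['init', 'lookup', 'write']
```

Scaling this to many long traces is exactly where the dissertation's optimizations (shorter sequence covers, subsequence ranking) become necessary; the quadratic-space DP above is only for illustration.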

Relevance:

30.00%

Publisher:

Abstract:

In a society where globalization, new technologies, social media, and socio-cultural changes dictate the rules of our lives, a focus on differentiation proves fundamental. The behavior of companies and consumers has changed in recent years, ever since the internet was introduced into their daily lives. Companies began to approach the market differently, which in turn radically changed the way consumers interact with them; this added a new dynamic to the relationship between the two parties and consequently enabled an interactive process of mutual learning. Focusing not only on the theoretical concepts of Digital Marketing, this report characterizes and analyzes the process by which this branch of marketing was adopted by the company Aleluia Cerâmicas, concluding with a set of suggestions for future improvement that could increase customer satisfaction and productivity.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation presents the design of three high-performance successive-approximation-register (SAR) analog-to-digital converters (ADCs) using distinct digital background calibration techniques under the framework of a generalized code-domain linear equalizer. These digital calibration techniques effectively and efficiently remove the static mismatch errors in the analog-to-digital (A/D) conversion. They enable aggressive scaling of the capacitive digital-to-analog converter (DAC), which also serves as the sampling capacitor, down to the kT/C limit. As a result, outstanding conversion linearity, high signal-to-noise ratio (SNR), high conversion speed, robustness, superb energy efficiency, and minimal chip area are accomplished simultaneously. The first design is a 12-bit 22.5/45-MS/s SAR ADC in a 0.13-μm CMOS process. It employs a perturbation-based calibration, based on the superposition property of linear systems, to digitally correct the capacitor mismatch error in the weighted DAC. With 3.0-mW power dissipation at a 1.2-V supply and a 22.5-MS/s sample rate, it achieves a 71.1-dB signal-to-noise-plus-distortion ratio (SNDR) and a 94.6-dB spurious-free dynamic range (SFDR). At the Nyquist frequency, the conversion figure of merit (FoM) is 50.8 fJ/conversion-step, the best FoM reported to date (2010) for 12-bit ADCs. The SAR ADC core occupies 0.06 mm², while the estimated area of the calibration circuits is 0.03 mm². The second proposed technique is a bit-wise-correlation-based digital calibration. It utilizes the statistical independence of an injected pseudo-random signal from the input signal to correct the DAC mismatch in SAR ADCs. This idea is experimentally verified in a 12-bit 37-MS/s SAR ADC fabricated in 65-nm CMOS, implemented by Pingli Huang. This prototype chip achieves a 70.23-dB peak SNDR and an 81.02-dB peak SFDR, while occupying 0.12 mm² of silicon area and dissipating 9.14 mW from a 1.2-V supply with the synthesized digital calibration circuits included. The third work is an 8-bit, 600-MS/s, 10-way time-interleaved SAR ADC array fabricated in a 0.13-μm CMOS process. This work employs an adaptive digital equalization approach to calibrate both intra-channel nonlinearities and inter-channel mismatch errors. The prototype chip achieves 47.4-dB SNDR, 63.6-dB SFDR, less than 0.30-LSB differential nonlinearity (DNL), and less than 0.23-LSB integral nonlinearity (INL). The ADC array occupies an active area of 1.35 mm² and dissipates 30.3 mW, including the synthesized digital calibration circuits and an on-chip dual-loop delay-locked loop (DLL) for clock generation and synchronization.
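
The calibration techniques themselves are beyond a short sketch, but the successive-approximation search at the heart of any SAR ADC is easy to illustrate behaviorally. The model below assumes an ideal DAC with no capacitor mismatch, which is precisely the error source the dissertation's calibration removes:

```python
def sar_adc(vin, vref=1.0, bits=12):
    """Behavioral SAR ADC sketch: successive approximation resolves one bit
    per cycle by comparing the input against an ideal DAC output."""
    code = 0
    for b in range(bits - 1, -1, -1):
        trial = code | (1 << b)
        vdac = trial * vref / (1 << bits)  # ideal capacitive-DAC output
        if vin >= vdac:
            code = trial                   # comparator high: keep the bit
    return code

print(sar_adc(0.5))  # -> 2048 (mid-scale for vref = 1.0, 12 bits)
```

With real capacitor mismatch, each `vdac` step deviates from its ideal binary weight; background calibration estimates those deviations digitally and corrects the output code.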

Relevance:

30.00%

Publisher:

Abstract:

The generation of functional, vascularized tissues is a key challenge for the field of tissue engineering. Before clinical implantation of tissue-engineered bone constructs can succeed, in vitro fabrication needs to address limitations in large-scale tissue development, including controlled osteogenesis and an inadequate vascular network to prevent necrosis of large constructs. The tubular perfusion system (TPS) bioreactor is an effective culturing method to augment osteogenic differentiation and maintain the viability of human mesenchymal stem cell (hMSC)-seeded scaffolds while they are developed in vitro. To further enhance this process, we developed a novel osteogenic growth factor delivery system for dynamically cultured hMSCs using microparticles encapsulated in three-dimensional alginate scaffolds. In light of this increased differentiation, we characterized the endogenous cytokine distribution throughout the TPS bioreactor. An advantageous effect in the 'outlet' portion of the uniaxial growth chamber was discovered, due to the system's downstream circulation and the unique modular aspect of the scaffolds. This trait allowed us to carefully tune the differentiation behavior of specific cell populations. We applied the knowledge gained from the growth profile of the TPS bioreactor to culture a high-volume bone composite in a 3D-printed femur mold. This resulted in a tissue-engineered bone construct with a volume of 200 cm³, a 20-fold increase over previously reported sizes. We demonstrated high viability of the cultured cells throughout the culture period, as well as early signs of osteogenic differentiation. To take one step closer to a viable implant and minimize tissue necrosis after implantation, we designed a composite construct by coculturing endothelial cells (ECs) and differentiating hMSCs, encouraging prevascularization and anastomosis of the graft with the host vasculature. We discovered the necessity of cell-to-cell proximity between the two cell types, as well as a preference for the natural cell-binding capabilities of hydrogels such as collagen. Notably, the results suggested increased osteogenic and angiogenic potential of the encapsulated cells when dynamically cultured in the TPS bioreactor, suggesting a synergistic effect between coculture and applied shear stress. This work highlights the feasibility of fabricating a high-volume, prevascularized tissue-engineered bone construct for the regeneration of a critical-size defect.

Relevance:

30.00%

Publisher:

Abstract:

There is clear evidence that, in typically developing children, reasoning and sense-making are essential in all mathematical learning and understanding processes. In children with autism spectrum disorders (ASD), however, these become much more significant, considering their importance to successful independent living. This paper presents a preliminary proposal for a digital environment specifically targeted at promoting the development of mathematical reasoning in students with ASD. Given the diversity of ASD, the prototyping of this environment requires the study of dynamic adaptation processes and the development of activities adjusted to each user's profile. We present the results obtained during the first phase of this ongoing research, describing a conceptual model of the proposed digital environment. Guidelines for future research are also discussed.

Relevance:

30.00%

Publisher:

Abstract:

The use of technology is considered an effective means of working on academic content with students with Autism Spectrum Disorders (ASD), enabling the creation of creative and constructive environments in which differentiated, meaningful, high-quality activities can be developed. However, the development of technological applications for children and young people with ASD still receives little attention, particularly with regard to promoting deductive reasoning, even though this is an area of great interest for individuals with this disorder. For students with ASD, the development of mathematical reasoning becomes crucial, given the importance of these skills for a successful independent life. This evidence highlights the innovative contribution that the learning environment described in this paper can make in this area. The development of this environment began with a stage of creating and validating a model that allowed the developed solution to be specified and prototyped; the solution offers modes of dynamic adaptation of the proposed activities to the user's profile, seeking to promote the development of mathematical reasoning (inductive and deductive). Given the heterogeneity of ASD, the developed environment is based on dynamic adaptation modes and on activities adjusted to user profiles. In this paper we present the research work carried out so far, as well as the outlook for the work still to be developed.

Relevance:

30.00%

Publisher:

Abstract:

This thesis examines digital technologies policies designed for Australian schools and the ways they are understood and interpreted by students, school staff, teachers, principals, and policy writers. This study explores the ways these research participant groups interpret and understand the 'ethical dimension' of schools' digital technologies policies for teaching and learning. In this thesis the ethical dimension is considered to be a dynamic concept encompassing various elements, including decisions, actions, values, issues, debates, education, discourses, and notions of right and wrong, in relation to ethics and uses of digital technologies in schools. In this study, policy is taken to mean not only written texts but also discursive processes and policy documents, including national declarations, strategic plans, and 'acceptable use' policies that guide the use of digital technologies in schools. The research is situated in the context of changes that have occurred in Australia and internationally over the last decade, which have seen a greater focus on access to and use of digital technologies in schools. In Australian school education, the attention placed on digital technologies has seen the release of policies at the national, state, territory, education office, and school levels to guide their use. Prominent among these policies has been the Digital Education Revolution policy, launched in 2007 and concluded in 2013. This research aims to answer the question: What does an investigation reveal about understandings of the ethical dimension of digital technologies policies and their implementation in school education? The objective of this research is to examine the ethical dimension of digital technologies policies and to interpret and understand the responses of the research participants to the issues, silences, discourses, and language which characterise this dimension.
In doing so, it is intended that the research can allow the participants to have a voice that may be different from the official discourses located in digital technologies policies. The thesis takes a critical and interpretative approach to policies and examines the role of digital technologies policies as discourse. Interpretative theory is utilised as it provides a conceptual lens through which to interpret different perspectives and their implications for the construction of meaning in relation to schools' digital technologies policies. Critical theory is used in tandem with interpretative theory as it represents a conceptual basis from which to critique and question the underlying assumptions and discourses associated with the ethical dimension of schools' digital technologies policies. The research methods used are semi-structured interviews and policy document analysis. Policies from the national, state, territory, education office, and school levels were analysed and contribute to understanding the way the ethical dimension of digital technologies policies is represented as a discourse. Students, school staff, teachers, principals, and policy writers participated in research interviews, and their views and perspectives were canvassed in relation to the ethical use of digital technologies and the policies designed to regulate their use. The thesis presents an argument that the ethical dimension of schools' digital technologies policies and use is an under-researched area, with gaps in understanding and knowledge in the literature that remain to be addressed. It is envisaged that the thesis can make a meaningful contribution to understanding the ways in which schools' digital technologies policies are understood in school contexts. It is also envisaged that the findings from the research can inform policy development by analysing the voices and views of those in schools.
The findings of the policy analysis revealed that little attention is given to the ethical dimension of digital technologies at the national level. A discourse of compliance and control pervades digital technologies policies at the state, education office, and school levels, which reduces ethical considerations to technical, legal, and regulatory requirements. The discourse is largely instrumentalist and neglects the educative dimension of digital technologies, which has the capacity to engender their ethical use. The findings from the interview conversations revealed that students, school staff, and teachers perceive digital technologies policies to be difficult to understand and not relevant to their situation and needs. They also expressed a desire for greater consultation and participation in the formation and enactment of digital technologies policies, and they believe they are marginalised from these processes in their schools. Arising from the analysis of the policies and interview conversations, an argument is presented that, in light of the prominent role played by digital technologies and their potential for enhancing all aspects of school education, more research is required to provide a more holistic and richer understanding of the policies that are constructed to control and mediate their use.

Relevance:

30.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Direito, Programa de Pós-Graduação em Direito, 2016.