953 results for Dynamic Capabilities


Relevance:

30.00%

Abstract:

Operational capabilities are characterized as a firm's internal resource and a source of competitive advantage. However, the operations strategy literature provides an inadequate constitutive definition for operational capabilities: it disregards the differences across contexts, rests on a limited empirical base, and does not adequately explore the extensive literature on operational practices. When operational practices are operationalized within the firm's internal environment, they can become embedded in organizational routines and, through the tacit knowledge of production, be transformed into operational capabilities, thereby creating barriers to imitation. Even so, few researchers have explored operational practices as antecedents of operational capabilities. Based on a literature review, we investigated the nature of operational capabilities; the relationship between operational practices and operational capabilities; the types of operational capabilities that are characterized in the firm's internal environment; and the impact of operational capabilities on operational performance. We conducted a mixed-method study. In the qualitative stage, we carried out multiple case studies with four firms: two American multinationals operating in Brazil and two Brazilian firms. We collected data through semi-structured interviews with semi-open questions, based on the literature review on operational practices and operational capabilities. The interviews were conducted in person. In total, 73 interviews were carried out (21 in the first case, 18 in the second, 18 in the third, and 16 in the fourth). All interviews were recorded and transcribed verbatim. We used the NVivo software. In the quantitative stage, our sample comprised 206 firms. The questionnaire was built from an extensive literature review and from the results of the qualitative phase. The Q-sort method was applied, and a pre-test was conducted with production managers. Measures were taken to reduce Common Method Variance. Ten scales were used in total: 1) Continuous Improvement; 2) Information Management; 3) Learning; 4) Customer Support; 5) Innovation; 6) Operational Efficiency; 7) Flexibility; 8) Customization; 9) Supplier Management; and 10) Operational Performance. We used confirmatory factor analysis to establish reliability as well as content, convergent, and discriminant validity. The data were analyzed using multiple regressions. Our main results were as follows. First, operational practices act as antecedents of operational capabilities. Second, we created a typology divided into two constructs. The first construct, called Standalone Capabilities, consists of zero-order capabilities such as Customer Support, Innovation, Operational Efficiency, Flexibility, and Supplier Management. These operational capabilities aim to improve the firm's processes and have a direct relationship with operational performance. The second construct, called Across-the-Board Capabilities, is composed of first-order capabilities such as Continuous Learning and Information Management. These operational capabilities are considered dynamic and play the role of reconfiguring the Standalone Capabilities.
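As an illustration of the quantitative stage, the sketch below shows the kind of multiple-regression test the abstract describes: regressing operational performance on the Standalone Capabilities scales. The data file, variable names, and model specification are hypothetical and invented for illustration; the thesis's actual measurement model is not reproduced here.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey data, one row per firm (the abstract reports n = 206).
df = pd.read_csv("survey_206_firms.csv")

# Hypothetical scale scores, e.g. averaged Likert items per construct.
standalone = ["customer_support", "innovation", "operational_efficiency",
              "flexibility", "supplier_management"]

X = sm.add_constant(df[standalone])   # add an intercept term
y = df["operational_performance"]

model = sm.OLS(y, X).fit()
print(model.summary())                # coefficients test the direct effects on performance
```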

Relevance:

30.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
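To make the data-driven idea concrete, here is a toy sketch, not the Fluid project's actual component model, in which a self-describing XML document, rather than code, determines which components are assembled and how they are configured at runtime. All element and class names are invented.

```python
import xml.etree.ElementTree as ET

# Content data: the XML, not the program, decides what the application contains.
SCENE_XML = """
<application>
  <component type="Renderer" fps="60"/>
  <component type="Physics" gravity="-9.81"/>
</application>
"""

REGISTRY = {}

def component(cls):
    """Register a component class so content data can instantiate it by name."""
    REGISTRY[cls.__name__] = cls
    return cls

@component
class Renderer:
    def __init__(self, fps): self.fps = int(fps)

@component
class Physics:
    def __init__(self, gravity): self.gravity = float(gravity)

# Assemble the application purely from the data description.
app = [REGISTRY[el.get("type")](**{k: v for k, v in el.attrib.items() if k != "type"})
       for el in ET.fromstring(SCENE_XML)]
print([type(c).__name__ for c in app])  # ['Renderer', 'Physics']
```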

Relevance:

30.00%

Abstract:

The primary purpose of this thesis was to design and develop a prototype e-commerce system in which dynamic parameters are included in the decision-making process and execution of an online transaction. The system developed and implemented takes into account previous usage history, priority, and associated engineering capabilities. The system was built on a three-tiered client-server architecture. The interface was the Internet browser. The middle-tier web server was implemented using Active Server Pages, which form a link between the client system and other servers. A relational database management system formed the data component of the three-tiered architecture. It includes a data warehousing capability which extracts needed information from the stored data of customers and their orders. The system organizes and analyzes the data generated during a transaction to formulate a model of a client's behavior during and after a transaction. This is used for making decisions such as pricing and order rescheduling during a client's forthcoming transactions. Among other things, the system helps bring predictability to the transaction execution process, which can be highly desirable in the current competitive scenario.
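A minimal sketch of the kind of dynamic decision-making described above, under invented fields and thresholds: a price quote is adjusted using parameters mined from the client's usage history. This is an illustration only, not the thesis's actual behavior model.

```python
from dataclasses import dataclass

@dataclass
class ClientProfile:
    total_orders: int
    on_time_payment_rate: float  # 0.0 .. 1.0, mined from the data warehouse
    priority: int                # 1 = highest

def quote_price(base_price: float, profile: ClientProfile) -> float:
    """Adjust a base price using dynamic parameters from the client's history."""
    price = base_price
    if profile.total_orders > 50:            # loyalty discount for repeat clients
        price *= 0.95
    if profile.on_time_payment_rate < 0.8:   # risk premium for poor payment history
        price *= 1.05
    if profile.priority == 1:                # surcharge for expedited handling
        price *= 1.10
    return round(price, 2)

print(quote_price(100.0, ClientProfile(total_orders=72,
                                       on_time_payment_rate=0.9,
                                       priority=1)))  # 104.5
```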

Relevance:

30.00%

Abstract:

Android is becoming ubiquitous and currently has the largest share of the mobile OS market, with billions of application downloads from the official app market. It has also become the platform most targeted by mobile malware, which is becoming more sophisticated in order to evade state-of-the-art detection approaches. Many Android malware families employ obfuscation techniques to avoid detection, which may defeat static analysis based approaches. Dynamic analysis, on the other hand, can be used to overcome this limitation. Hence, in this paper we propose DynaLog, a dynamic analysis based framework for characterizing Android applications. The framework provides the capability to analyse the behaviour of applications based on an extensive number of dynamic features. It provides an automated platform for the mass analysis and characterization of apps that is useful for quickly identifying and isolating malicious applications. The DynaLog framework leverages existing open source tools to extract and log high-level behaviours, API calls, and critical events that can be used to explore the characteristics of an application, thus providing an extensible dynamic analysis platform for detecting Android malware. DynaLog is evaluated using real malware samples and clean applications, demonstrating its capabilities for effective analysis and detection of malicious applications.
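To illustrate the general approach, the sketch below shows how behavioural features might be extracted from a dynamic-analysis log of the kind such a framework produces. The log format and feature list are invented for illustration and do not reproduce DynaLog's actual implementation.

```python
from collections import Counter

# Hypothetical high-level behaviours of interest: sensitive API calls and events.
FEATURES = ["sendTextMessage", "getDeviceId", "DexClassLoader",
            "android.intent.action.BOOT_COMPLETED"]

def extract_features(log_text: str) -> dict:
    """Build a binary feature vector: did each behaviour occur in the run-time log?"""
    counts = Counter()
    for line in log_text.splitlines():
        for feat in FEATURES:
            if feat in line:
                counts[feat] += 1
    return {feat: int(counts[feat] > 0) for feat in FEATURES}

sample_log = """I/Analysis: API call sendTextMessage(+1555..., 'win a prize')
I/Analysis: event android.intent.action.BOOT_COMPLETED received"""
print(extract_features(sample_log))
```

A vector like this, computed per app over many emulator runs, is the kind of input a downstream classifier can use to separate malicious from clean applications.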

Relevance:

30.00%

Abstract:

The generation of functional, vascularized tissues is a key challenge for the field of tissue engineering. Before clinical implantation of tissue engineered bone constructs can succeed, in vitro fabrication needs to address limitations in large-scale tissue development, including controlled osteogenesis and an inadequate vasculature network to prevent necrosis of large constructs. The tubular perfusion system (TPS) bioreactor is an effective culturing method to augment osteogenic differentiation and maintain the viability of human mesenchymal stem cell (hMSC)-seeded scaffolds while they are developed in vitro. To further enhance this process, we developed a novel osteogenic growth factor delivery system for dynamically cultured hMSCs using microparticles encapsulated in three-dimensional alginate scaffolds. In light of this increased differentiation, we characterized the endogenous cytokine distribution throughout the TPS bioreactor. An advantageous effect in the ‘outlet’ portion of the uniaxial growth chamber was discovered, due to the system’s downstream circulation and the unique modular aspect of the scaffolds. This trait allowed us to carefully tune the differentiation behavior of specific cell populations. We applied the knowledge gained from the growth profile of the TPS bioreactor to culture a high-volume bone composite in a 3D-printed femur mold. This resulted in a tissue engineered bone construct with a volume of 200 cm³, a 20-fold increase over previously reported sizes. We demonstrated high viability of the cultured cells throughout the culture period as well as early signs of osteogenic differentiation. To take one step closer toward a viable implant and minimize tissue necrosis after implantation, we designed a composite construct by coculturing endothelial cells (ECs) and differentiating hMSCs, encouraging prevascularization and anastomosis of the graft with the host vasculature. We discovered the necessity of cell-to-cell proximity between the two cell types, as well as a preference for the natural cell-binding capabilities of hydrogels like collagen. Notably, the results indicated increased osteogenic and angiogenic potential of the encapsulated cells when dynamically cultured in the TPS bioreactor, suggesting a synergistic effect between coculture and applied shear stress. This work highlights the feasibility of fabricating a high-volume, prevascularized tissue engineered bone construct for the regeneration of a critical size defect.

Relevance:

30.00%

Abstract:

Robots are increasingly common in a variety of workplaces, providing an array of benefits such as alternative solutions to traditional human labor. While developing fully autonomous robots is the ultimate goal in many robotic applications, the reality is that there still exist many situations where robots require some level of teleoperation in order to achieve assigned goals, especially when deployed in non-deterministic environments. For instance, teleoperation is commonly used in areas such as search and rescue, bomb disposal, and the exploration of inaccessible or harsh terrain. This is due to a range of factors, such as robots' limited ability to quickly and reliably navigate unknown environments or to provide high-level decision making, especially in time-critical tasks. To provide an adequate solution for such situations, human-in-the-loop control is required. When developing human-in-the-loop control it is important to take advantage of the complementary skill sets that humans and robots share. For example, robots can perform rapid calculations, provide accurate measurements through hardware such as sensors, and store large amounts of data, while humans provide experience, intuition, risk management, and complex decision-making capabilities. Shared autonomy is the concept of building robotic systems that take advantage of these complementary skill sets to provide a robust and efficient robotic solution. While the requirement for human-in-the-loop control exists, Human Machine Interaction (HMI) remains an important research topic, especially the area of User Interface (UI) design. In order to provide operators with an effective teleoperation system, it is important that the interface is intuitive and dynamic while also achieving a high level of immersion. Recent advancements in virtual and augmented reality hardware are giving rise to innovative HMI systems. Interactive hardware such as the Microsoft Kinect, Leap Motion, Oculus Rift, Samsung Gear VR, and even CAVE Automatic Virtual Environments [1] are providing vast improvements over traditional user interface designs, as exemplified by the experimental web browser JanusVR [2]. This, combined with the introduction of standardized robot frameworks such as ROS and Webots [3] that now support a large number of different robots, provides an opportunity to develop a universal UI for teleoperation control that improves operator efficiency while reducing teleoperation training. This research introduces the concept of a dynamic virtual workspace for the teleoperation of heterogeneous robots in non-deterministic environments that require human-in-the-loop control. The system first identifies the connected robots through the use of kinematic information, then determines their network capabilities, such as latency and bandwidth. Given the robot type and network capabilities, the system can then provide the operator with the available teleoperation modes, such as pick-and-place control or waypoint navigation, while also allowing the operator to manipulate the virtual workspace layout to present information from onboard cameras or sensors.
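A minimal sketch of the mode-selection logic outlined above, using invented robot types, thresholds, and mode names: given a robot's kinematic type and its measured network quality, the operator is offered only the teleoperation modes that remain safe and responsive.

```python
# Hypothetical mapping from robot kinematics to supported teleoperation modes.
MODES_BY_TYPE = {
    "manipulator": ["direct_control", "pick_and_place"],
    "mobile_base": ["direct_control", "waypoint_navigation"],
}

def available_modes(robot_type: str, latency_ms: float, bandwidth_mbps: float) -> list[str]:
    modes = list(MODES_BY_TYPE.get(robot_type, []))
    # High latency makes closed-loop direct control unsafe; fall back to
    # higher-level supervisory modes that tolerate delay.
    if latency_ms > 250 and "direct_control" in modes:
        modes.remove("direct_control")
    # Sufficient bandwidth allows streaming an onboard camera feed into the
    # virtual workspace alongside control.
    if bandwidth_mbps >= 5.0:
        modes.append("video_feed")
    return modes

print(available_modes("mobile_base", latency_ms=400, bandwidth_mbps=8.0))
# ['waypoint_navigation', 'video_feed']
```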

Relevance:

20.00%

Abstract:

The ability of agents and services to automatically locate and interact with unknown partners is a goal for both the semantic web and web services. This "serendipitous interoperability" is hindered by the lack of an explicit means of describing what services (or agents) are able to do, that is, their capabilities. At present, informal descriptions of what services can do are found in "documentation" elements, or they are somehow encoded in operation names and signatures. We show, by reference to existing service examples, how ambiguous and imprecise capability descriptions hamper the attainment of automated interoperability goals in the open, global web environment. In this paper we propose a structured, machine readable description of capabilities, which may help to increase the recall and precision of service discovery mechanisms. Our capability description draws on previous work in capability and process modeling and allows the incorporation of external classification schemes. The capability description is presented as a conceptual meta model. The model supports conceptual queries and can be used as an extension to the DAML-S Service Profile.
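As a toy rendering of the core idea, the sketch below shows a structured, machine-readable capability description that a discovery mechanism can match against, in contrast to free-text "documentation" elements. The schema and matcher are invented for illustration and are not the paper's actual meta model or the DAML-S profile.

```python
from dataclasses import dataclass, field

@dataclass
class Capability:
    verb: str                      # the action the service performs
    object_type: str               # what it acts on, drawn from an external classification scheme
    inputs: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)

def matches(offer: Capability, request: Capability) -> bool:
    """A deliberately naive matcher: exact verb/object match plus I/O coverage."""
    return (offer.verb == request.verb
            and offer.object_type == request.object_type
            and set(request.inputs) <= set(offer.inputs)
            and set(request.outputs) <= set(offer.outputs))

offer = Capability("book", "FlightTicket", inputs=["date", "route"], outputs=["booking_ref"])
request = Capability("book", "FlightTicket", inputs=["date", "route"], outputs=["booking_ref"])
print(matches(offer, request))  # True: this offer can serve the request
```

Because both sides describe capabilities in the same structured vocabulary, a discovery service can compute matches mechanically, improving recall and precision over keyword search of operation names.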