503 results for algorithmic skeletons


Relevance:

20.00%

Publisher:

Abstract:

In the new world of work, workers not only change jobs more frequently, but also perform independent work on online labor markets. As they accomplish smaller and shorter jobs at the boundaries of organizations, employment relationships become unstable and career trajectories less linear. These new working conditions question the validity of existing management theories and call for more studies explaining gig workers’ behavior. The aim of this dissertation is to contribute to this emerging body of knowledge by (I) exploring how gig workers shape their work identity on online platforms, and (II) investigating how algorithmic reputation changes the dynamics of quality signaling and affects gig workers’ behavior. Chapter 1 introduces the debate on gig work, detailing why existing theories and definitions cannot be applied to this emergent workforce. Chapter 2 provides a systematic review of studies on individual work in online labor markets and identifies areas for future research. Chapter 3 describes the exploratory, qualitative methodology applied to collect and analyze data. Chapter 4 presents the first empirical paper, which investigates how the process of work identity construction unfolds for gig workers. It explores how digital platforms, understood both as providers of technological features and as online environments, affect this process. Findings reveal that the online environment constrains the action of workers, who are pushed to take advantage of the platform’s technological features to succeed. This interplay leads workers to develop an entrepreneurial orientation. Drawing on signaling theory, Chapter 5 examines how gig workers interpret algorithmically calculated reputation and with what consequences for their experience. Results show that, after complying with the platform’s rules in an initial period, freelancers respond to algorithmic management through different strategies, i.e. manipulation, nurturing relationships, and living with it. Although reputation scores standardize information on freelancers’ quality and, apparently, freelancers’ work, this study shows that responses to algorithmic control can instead be diverse.

Relevance:

20.00%

Publisher:

Abstract:

This thesis studies how commercial practice is developing with artificial intelligence (AI) technologies and discusses some normative concepts in EU consumer law. The author analyses the phenomenon of 'algorithmic business', which denotes the increasing use of data-driven AI in marketing organisations for the optimisation of a range of consumer-related tasks. The phenomenon is orienting business-consumer relations towards some general trends that influence the power and behaviour of consumers. These developments are not taking place in a legal vacuum, but against the background of a normative system aimed at maintaining fairness and balance in market transactions. The author assesses current developments in commercial practices in the context of EU consumer law, which is specifically aimed at regulating commercial practices. The analysis is critical by design and, without neglecting concrete practices, tries to look at the big picture. The thesis consists of nine chapters divided into three thematic parts. The first part discusses the deployment of AI in marketing organisations: a brief history, the technical foundations, and the modes of integration into business organisations. In the second part, a selected number of socio-technical developments in commercial practice are analysed: the monitoring and analysis of consumers’ behaviour based on data; the personalisation of commercial offers and customer experience; the use of information on consumers’ psychology and emotions; and the mediation of commercial relations through conversational marketing applications. The third part assesses these developments in the context of EU consumer law and of the broader policy debate concerning consumer protection in the algorithmic society. In particular, two normative concepts underlying the EU fairness standard are analysed: manipulation, as a substantive regulatory standard that limits commercial behaviours in order to protect consumers’ informed and free choices; and vulnerability, as a concept of social policy that portrays people who are more exposed to marketing practices.

Relevance:

10.00%

Publisher:

Abstract:

Bioassay-directed fractionation of the ethanol extracts of two Amphimedon spp. collected during trawling operations in the Great Australian Bight yielded four new macrocyclic lactone/lactams, amphilactams A-D (1-4). The amphilactams possess potent in vitro nematocidal properties, and their structures were assigned on the basis of detailed spectroscopic analysis and comparison with synthetic model compounds. The amphilactams feature both carbon skeletons and an enamino lactone/lactam moiety unprecedented in the natural products literature.

Relevance:

10.00%

Publisher:

Abstract:

The primary purpose of this study was to estimate the magnitude and variability of peak calcium accretion rates in the skeletons of healthy white adolescents. Total-body bone mineral content (BMC) was measured annually on six occasions by dual-energy X-ray absorptiometry (DXA; Hologic 2000, array mode), a BMC velocity curve was generated for each child by a cubic spline fit, and peak accretion rates were determined. Anthropometric measures were collected every 6 months, and a 24-h dietary recall was recorded two to three times per year. Of the 113 boys and 115 girls initially enrolled in the study, 60 boys and 53 girls who had peak height velocity (PHV) and peak BMC velocity values were used in this longitudinal analysis. When the individual BMC velocity curves were aligned on the age of peak bone mineral velocity, the resulting mean peak bone mineral accrual rate was 407 g/year for boys (SD, 92 g/year; range, 226-651 g/year) and 322 g/year for girls (SD, 66 g/year; range, 194-520 g/year). Using 32.2% as the fraction of calcium in bone mineral, as determined by neutron activation analysis (Ellis et al., J Bone Miner Res 1996;11:843-848), these corresponded to peak calcium accretion rates of 359 mg/day for boys (SD, 81 mg/day; range, 199-574 mg/day) and 284 mg/day for girls (SD, 58 mg/day; range, 171-459 mg/day). These longitudinal results are 27-34% higher than our previous cross-sectional analysis, in which we reported mean values of 282 mg/day for boys and 212 mg/day for girls (Martin et al., Am J Clin Nutr 1997;66:611-615). Mean age of peak calcium accretion was 14.0 years for the boys (SD, 1.0 years; range, 12.0-15.9 years) and 12.5 years for the girls (SD, 0.9 years; range, 10.5-14.6 years). Dietary calcium intake, determined as the mean of all assessments up to the age of peak accretion, was 1140 mg/day (SD, 392 mg/day) for boys and 1113 mg/day (SD, 378 mg/day) for girls. We estimate that 26% of adult calcium is laid down during the 2 adolescent years of peak skeletal growth. This period of rapid growth requires high accretion rates of calcium, achieved in part by increased retention efficiency of dietary calcium.
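As a quick check on the arithmetic in the abstract above, the following minimal Python sketch (not the authors' code; the function name and the 365.25-day year are my own choices) converts the reported peak BMC accrual rates into calcium accretion rates using the 32.2% calcium fraction.

```python
# Minimal sketch: convert peak total-body BMC accrual (g/year, from a cubic-spline
# velocity curve) into calcium accretion (mg/day) with the 32.2% calcium fraction
# of bone mineral cited in the abstract.

CALCIUM_FRACTION = 0.322   # Ellis et al., J Bone Miner Res 1996
DAYS_PER_YEAR = 365.25

def calcium_accretion_mg_per_day(peak_bmc_g_per_year: float) -> float:
    """Peak calcium accretion (mg/day) from peak BMC accrual (g/year)."""
    return peak_bmc_g_per_year * CALCIUM_FRACTION / DAYS_PER_YEAR * 1000.0

if __name__ == "__main__":
    for label, peak in [("boys", 407.0), ("girls", 322.0)]:
        print(f"{label}: {calcium_accretion_mg_per_day(peak):.0f} mg/day")
    # prints roughly 359 mg/day for boys and 284 mg/day for girls,
    # matching the values reported in the abstract
```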

Relevance:

10.00%

Publisher:

Abstract:

Colonius suggests that, in using standard set theory as the language in which to express our computational-level theory of human memory, we would need to violate the axiom of foundation in order to express meaningful memory bindings in which a context is identical to an item in the list. We circumvent Colonius's objection by allowing that a list item may serve as a label for a context without being identical to that context. This debate serves to highlight the value of specifying memory operations in set theoretic notation, as it would have been difficult if not impossible to formulate such an objection at the algorithmic level.
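Purely as an illustration of the modelling move described above, the following Python sketch (hypothetical types, not from the paper) shows a binding in which a list item acts as the label of a context while remaining a distinct object, so no set needs to contain itself and the axiom of foundation is respected.

```python
# Illustrative sketch only: a list item serves as the *label* of a context
# without being identical to the context itself.

from dataclasses import dataclass

@dataclass(frozen=True)
class Item:
    name: str

@dataclass(frozen=True)
class Context:
    label: Item                         # the item that names this context
    features: frozenset = frozenset()

# bindings are (item, context) pairs; the context labelled by "apple" is a
# distinct object from the Item("apple") appearing in the study list
study_list = [Item("apple"), Item("house")]
ctx = Context(label=Item("apple"))
bindings = {(item, ctx) for item in study_list}
assert ctx != Item("apple")             # label is not the context: no self-membership
```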

Relevance:

10.00%

Publisher:

Abstract:

Using the method of quantum trajectories we show that a known pure state can be optimally monitored through time when subject to a sequence of discrete measurements. By modifying the way that we extract information from the measurement apparatus we can minimize the average algorithmic information of the measurement record, without changing the unconditional evolution of the measured system. We define an optimal measurement scheme as one which has the lowest average algorithmic information allowed. We also show how it is possible to extract information about system operator averages from the measurement records and their probabilities. The optimal measurement scheme, in the limit of weak coupling, determines the statistics of the variance of the measured variable directly. We discuss the relevance of such measurements for recent experiments in quantum optics.
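To make the idea concrete, here is a hedged Python sketch (my own construction, not the paper's scheme): two sets of Kraus operators describe the same unconditional evolution of a qubit, yet yield measurement records with very different Shannon entropies, which serves here as a computable stand-in for the average algorithmic information of the record.

```python
# Hedged illustration: two readouts of the same weak measurement give identical
# unconditional evolution but records of very different Shannon entropy.

import numpy as np

theta = 0.1                                   # weak-coupling regime
c, s = np.cos(theta), np.sin(theta)

# original two-outcome Kraus operators (diagonal in the z basis)
M0 = np.diag([c, s])
M1 = np.diag([s, c])

# alternative readout: mix the outcomes 50/50; the unconditional map
# sum_i M_i rho M_i^dagger is unchanged by this recombination
N0 = (M0 + M1) / np.sqrt(2)
N1 = (M0 - M1) / np.sqrt(2)

psi = np.array([1.0, 0.0])                    # known pure state |0>

def record_entropy(kraus, psi):
    """Shannon entropy (bits) of a single measurement outcome for |psi>."""
    p = np.array([np.vdot(k @ psi, k @ psi).real for k in kraus])
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

print("original readout:", record_entropy([M0, M1], psi))   # ~0.08 bits
print("mixed readout   :", record_entropy([N0, N1], psi))   # ~0.97 bits
```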

Relevance:

10.00%

Publisher:

Abstract:

Embryonic development of tendons is in close association with that of cartilage and bone. Although these tissues are derived from mesenchymal progenitor cells which also give rise to muscle and fat, their fates clearly diverge in early embryonic stages. Transcription factors may play pivotal roles in the process of determination and differentiation of tendon cells as well as other cells in the skeletal system. Scleraxis, a basic helix-loop-helix (bHLH) type transcription factor, is expressed in mesenchymal progenitors that later form connective tissues including tendons. Sox9 is an HMG-box-containing transcription factor, which is expressed at high levels in chondrocytes. We hypothesized that the two transcription factors regulate the fate of cells that interact with each other at the interface between the two tissues during divergence of their differentiation pathways. To address this point, we investigated scleraxis and Sox9 mRNA expression during mouse embryogenesis, focusing on the coordinated development of tendons and skeletons. In the early stage of mesenchymal tissue development at 10.5 d.p.c., scleraxis and Sox9 transcripts were expressed in the mesenchymal progenitor cells in the appendicular and axial mesenchyme. At 11.5 d.p.c., scleraxis transcripts were observed in the mesenchymal tissue surrounding skeletal primordia which express Sox9. From this stage, scleraxis expression was closely associated with, but distinct from, formation of skeletal primordia. At 13.5 d.p.c., scleraxis was expressed broadly in the interface between muscle and skeletal primordia, while Sox9 expression was confined within the early skeletal primordia. Then, at 15.5 d.p.c., scleraxis transcripts were more restricted to tendons. These observations revealed a temporal and spatial association of scleraxis expression during embryonic development of tendon precursor cells in close association with that of Sox9 expression in chondrogenic cells in skeletal tissues. (C) 2002 Orthopaedic Research Society. Published by Elsevier Science Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Most finite element packages use the Newmark algorithm for time integration of structural dynamics. Various algorithms have been proposed to better optimize the high frequency dissipation of this algorithm. Hulbert and Chung proposed both implicit and explicit forms of the generalized alpha method. The algorithms optimize high frequency dissipation effectively, and despite recent work on algorithms that possess momentum conserving/energy dissipative properties in a non-linear context, the generalized alpha method remains an efficient way to solve many problems, especially with adaptive timestep control. However, the implicit and explicit algorithms use incompatible parameter sets and cannot be used together in a spatial partition, whereas this can be done for the Newmark algorithm, as Hughes and Liu demonstrated, and for the HHT-alpha algorithm developed from it. The present paper shows that the explicit generalized alpha method can be rewritten so that it becomes compatible with the implicit form. All four algorithmic parameters can be matched between the explicit and implicit forms. An element interface between implicit and explicit partitions can then be used, analogous to that devised by Hughes and Liu to extend the Newmark method. The stability of the explicit/implicit algorithm is examined in a linear context and found to exceed that of the explicit partition. The element partition is significantly less dissipative of intermediate frequencies than one using the HHT-alpha method. The explicit algorithm can also be rewritten so that the discrete equation of motion evaluates forces from displacements and velocities found at the predicted mid-point of a cycle. Copyright (C) 2003 John Wiley & Sons, Ltd.
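As background for readers unfamiliar with this family of methods, here is a minimal Python sketch of the Newmark scheme the abstract takes as its starting point, applied to a single-degree-of-freedom linear system. This is only the baseline predictor/corrector structure such schemes share, not the authors' implicit/explicit generalized-alpha coupling.

```python
# Minimal Newmark sketch for a linear SDOF system  m*a + c*v + k*d = f(t).
# beta = 1/4, gamma = 1/2 (average acceleration): unconditionally stable, no
# numerical dissipation; generalized-alpha adds controllable high-frequency
# damping on top of this same structure.

import numpy as np

def newmark_sdof(m, c, k, f, d0, v0, h, n_steps, beta=0.25, gamma=0.5):
    d, v = d0, v0
    a = (f(0.0) - c * v - k * d) / m              # consistent initial acceleration
    out = [(0.0, d, v, a)]
    for n in range(1, n_steps + 1):
        t = n * h
        # predictors built from quantities known at step n-1
        d_pred = d + h * v + 0.5 * h**2 * (1 - 2 * beta) * a
        v_pred = v + h * (1 - gamma) * a
        # solve the (scalar) effective equation for the new acceleration
        a_new = (f(t) - c * v_pred - k * d_pred) / (m + gamma * h * c + beta * h**2 * k)
        d = d_pred + beta * h**2 * a_new
        v = v_pred + gamma * h * a_new
        a = a_new
        out.append((t, d, v, a))
    return np.array(out)

# Example: free vibration of an undamped oscillator, d(t) close to cos(t)
traj = newmark_sdof(m=1.0, c=0.0, k=1.0, f=lambda t: 0.0,
                    d0=1.0, v0=0.0, h=0.05, n_steps=200)
```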

Relevance:

10.00%

Publisher:

Abstract:

Over the last decade, software architecture emerged as a critical issue in Software Engineering. This encompassed a shift from traditional programming towards software development based on the deployment and assembly of independent components. The specification of both the overall system's structure and the interaction patterns between its components became a major concern for the working developer. Although a number of formalisms are available to express behaviour and to supply the indispensable calculational power to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from CCS behavioural specifications, the corresponding architectural skeletons in the Microsoft .Net framework, in the form of executable C# and Cω code. The prototyping process is fully supported by a specific tool developed in Haskell.
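To convey the flavour of such a derivation, the following toy Python sketch (hypothetical and greatly simplified; the actual tool described in the paper is written in Haskell) maps a small CCS-like process term to a C#-style architectural skeleton with one method stub per observable action.

```python
# Purely illustrative sketch: a toy CCS-like term is mapped to a schematic
# C#-style skeleton; names and output format are invented for this example.

from dataclasses import dataclass
from typing import Union

@dataclass
class Act:            # action prefix  a.P
    name: str
    cont: "Proc"

@dataclass
class Choice:         # P + Q
    left: "Proc"
    right: "Proc"

@dataclass
class Nil:            # inactive process 0
    pass

Proc = Union[Act, Choice, Nil]

def actions(p: Proc) -> list:
    """Collect the action names occurring in a process term."""
    if isinstance(p, Act):
        return [p.name] + actions(p.cont)
    if isinstance(p, Choice):
        return actions(p.left) + actions(p.right)
    return []

def skeleton(component: str, p: Proc) -> str:
    """Emit a schematic C# skeleton: one public stub per distinct action."""
    stubs = "\n".join(f"    public void {a}() {{ /* TODO */ }}"
                      for a in dict.fromkeys(actions(p)))
    return f"public class {component}\n{{\n{stubs}\n}}"

# buffer = In.Out.0 + In.0  -- a toy behavioural specification
spec = Choice(Act("In", Act("Out", Nil())), Act("In", Nil()))
print(skeleton("Buffer", spec))
```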

Relevance:

10.00%

Publisher:

Abstract:

Software architecture is currently recognized as one of the most critical design steps in Software Engineering. The specification of the overall system structure, on the one hand, and of the interaction patterns between its components, on the other, became a major concern for the working developer. Although a number of formalisms are available to express behaviour and supply the indispensable calculational power to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from behavioural specifications written in Cω, the corresponding architectural skeletons in the Microsoft .NET framework in the form of executable code.

Relevance:

10.00%

Publisher:

Abstract:

Over the last decade, software architecture emerged as a critical design step in Software Engineering. This encompassed a shift from traditional programming towards the deployment and assembly of independent components. The specification of the overall system structure, on the one hand, and of the interaction patterns between its components, on the other, became a major concern for the working developer. Although a number of formalisms are available to express behaviour and supply the indispensable calculational power to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from behavioural specifications written in CCS, the corresponding architectural skeletons in the Microsoft .Net framework in the form of executable C# code. This prototyping process is automated by means of a specific tool developed in Haskell.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a brief history of western music, from its genesis to serialism and the Darmstadt school. Some mathematical aspects of music are then presented and confronted with music as a form of art. The question is: are these two distinct aspects compatible? Can computers be of real help in automatic composition? The most appealing algorithmic approach is evolutionary computation, as it offers potential for creativity. Therefore, Evolutionary Algorithms are introduced, and some results of applying GAs and GP to music generation are analysed.
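As a concrete, hypothetical illustration of the evolutionary approach (not an example from the paper; the fitness function, rates and sizes are arbitrary), a small genetic algorithm can evolve a short melody towards notes of the C-major scale with small melodic leaps:

```python
# Toy genetic algorithm evolving an 8-note melody (MIDI pitches).
# Fitness rewards notes from the C-major scale and penalises large leaps.

import random

SCALE = {0, 2, 4, 5, 7, 9, 11}          # C-major pitch classes
LENGTH, POP, GENS, MUT = 8, 40, 200, 0.1

def fitness(melody):
    in_scale = sum(1 for n in melody if n % 12 in SCALE)
    leaps = sum(abs(a - b) for a, b in zip(melody, melody[1:]))
    return in_scale - 0.1 * leaps

def random_melody():
    return [random.randint(60, 72) for _ in range(LENGTH)]   # C4..C5

def crossover(a, b):
    cut = random.randint(1, LENGTH - 1)
    return a[:cut] + b[cut:]

def mutate(m):
    return [random.randint(60, 72) if random.random() < MUT else n for n in m]

population = [random_melody() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]                          # elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

print(max(population, key=fitness))
```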

Relevance:

10.00%

Publisher:

Abstract:

With the growth of the information available on the Web and in personal and professional archives, driven both by the increase in data storage capacity and by the exponential increase in computer processing power, together with easy access to that information, an enormous flow of production and distribution of audiovisual content has been generated. However, although mechanisms exist for indexing this content so that it can be searched and accessed, they usually involve high algorithmic complexity or require hiring highly qualified staff to verify and categorize the content. This dissertation studies solutions for collaborative annotation of content and develops a tool that facilitates the annotation of an archive of audiovisual content. The implemented approach is based on the concept of Games With a Purpose (GWAP) and allows users to create tags (metadata in the form of keywords) in order to assign meaning to an object being categorized. Thus, as a first objective, a game was developed whose purpose is not only entertainment but also the creation of audiovisual annotations for the videos presented to the player, thereby improving their indexing and categorization. The developed application also allows the visualization of the categorized content and metadata and, to create an additional informative element, allows the insertion of a "like" at a specific instant of the video. The main advantage of the developed application lies in attaching annotations to specific points of the video, more precisely to its time instants. This is a new feature, not available in other applications for collaborative annotation of audiovisual content. As a result, access to the content becomes considerably more effective, since it is possible to reach, through search, specific points inside a video.
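A possible data model for such time-anchored annotations, sketched in Python with hypothetical field names (not taken from the dissertation), is shown below; the point is that each tag and each like carries the instant of the video it refers to, which is what lets a search land at specific points inside a video.

```python
# Illustrative data model only: tags and likes anchored to video time instants.

from dataclasses import dataclass, field

@dataclass
class TimedTag:
    video_id: str
    time_s: float          # instant in the video the tag refers to
    tag: str
    player: str            # who proposed it during the game

@dataclass
class TimedLike:
    video_id: str
    time_s: float
    player: str

@dataclass
class VideoAnnotations:
    video_id: str
    tags: list = field(default_factory=list)
    likes: list = field(default_factory=list)

    def search(self, keyword: str):
        """Return the instants at which a keyword was tagged."""
        return sorted(t.time_s for t in self.tags if t.tag == keyword)

ann = VideoAnnotations("v42")
ann.tags.append(TimedTag("v42", 12.5, "goal", "player1"))
ann.tags.append(TimedTag("v42", 74.0, "goal", "player2"))
print(ann.search("goal"))      # -> [12.5, 74.0]
```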

Relevance:

10.00%

Publisher:

Abstract:

15th International Conference on Mixed Design of Integrated Circuits and Systems, pp. 177-180, Poznan, Poland

Relevance:

10.00%

Publisher:

Abstract:

Locating and identifying points as global minimizers is, in general, a hard and time-consuming task. Difficulties increase when the derivatives of the functions defining the problem cannot be used. In this work, we propose a new class of methods suited for global derivative-free constrained optimization. Using direct search of directional type, the algorithm alternates between a search step, where potentially good regions are located, and a poll step, where the previously located promising regions are explored. This exploitation is carried out by launching several instances of directional direct searches, one in each of the regions of interest. Unlike a simple multistart strategy, direct searches merge when they become sufficiently close. The goal is to end with as many direct searches as there are local minimizers, which would make it easy to locate the global minimum. We describe the algorithmic structure considered, present the corresponding convergence analysis, and report numerical results showing that the proposed method is competitive with commonly used global derivative-free optimization solvers.
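The following Python sketch is a deliberately simplified rendition of these ideas (not the authors' algorithm or its convergence safeguards): several directional direct searches run concurrently, a random search step occasionally seeds new regions, a coordinate poll step refines each iterate, and searches that come sufficiently close are merged.

```python
# Simplified multistart directional direct search with merging of nearby searches.

import numpy as np

def multistart_direct_search(f, lb, ub, n_starts=5, iters=200,
                             step=0.5, merge_tol=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lb)
    # each "search" keeps its own iterate and step size
    searches = [{"x": rng.uniform(lb, ub), "h": step} for _ in range(n_starts)]
    for _ in range(iters):
        # search step: occasionally seed a new region
        if rng.random() < 0.05:
            searches.append({"x": rng.uniform(lb, ub), "h": step})
        # poll step: try +/- coordinate directions around each iterate
        for s in searches:
            x, h = s["x"], s["h"]
            improved = False
            for d in np.vstack([np.eye(dim), -np.eye(dim)]):
                y = np.clip(x + h * d, lb, ub)
                if f(y) < f(x):
                    s["x"], improved = y, True
                    break
            if not improved:
                s["h"] *= 0.5            # shrink the step on an unsuccessful poll
        # merge searches that have come close, keeping the better iterate
        merged = []
        for s in sorted(searches, key=lambda s: f(s["x"])):
            if all(np.linalg.norm(s["x"] - m["x"]) > merge_tol for m in merged):
                merged.append(s)
        searches = merged
    best = min(searches, key=lambda s: f(s["x"]))
    return best["x"], f(best["x"])

# Example: a multimodal test function with global minimum 0 at the origin
rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
x_best, f_best = multistart_direct_search(rastrigin, lb=np.array([-5.0, -5.0]),
                                          ub=np.array([5.0, 5.0]))
```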