350 results for regulating mammary gland function


Relevance: 20.00%

Abstract:

The rise of the peer economy poses complex new regulatory challenges for policy-makers. The peer economy, typified by services like Uber and Airbnb, promises substantial productivity gains through the more efficient use of existing resources and a marked reduction in regulatory overheads. These services are rapidly disrupting established markets, but the regulatory trade-offs they present are difficult to evaluate. In this paper, we examine the peer economy through the context of ride-sharing and the ongoing struggle over regulatory legitimacy between the taxi industry and the new entrants Uber and Lyft. We first sketch the outlines of ride-sharing as a complex regulatory problem, showing how questions of efficiency are necessarily bound up in questions about levels of service, controls over pricing, and different approaches to setting, upholding, and enforcing standards. We outline the need for data-driven policy to understand how algorithmic systems work and what effects these might have in the medium to long term on measures of service quality, safety, labour relations, and equality. Finally, we discuss how the competition for legitimacy is not primarily being fought on utilitarian grounds, but is instead carried out within the context of a heated ideological battle between different conceptions of the role of the state and private firms as regulators. We ultimately argue that the key to understanding these regulatory challenges is to develop better conceptual models of the governance of complex systems by private actors and of the methods available to the state for influencing their actions. These struggles are not, as is often thought, struggles between regulated and unregulated systems; rather, they turn on the important regulatory work carried out by powerful, centralised private firms – both the incumbents of existing markets and the disruptive network operators of the peer economy.

Relevance: 20.00%

Abstract:

IT consumerization is both a major opportunity and a significant challenge for organizations. However, IS research has so far given little attention to its implications for IT management. In this paper we address this topic by empirically identifying organizational themes for IT consumerization and conceptually exploring the direct and indirect effects on the business value of IT, IT capabilities, and the IT function. More specifically, based on two case studies, we identify eight organizational themes: consumer IT strategy, policy development and responsibilities, consideration of employees' private lives, user involvement in IT-related processes, individualization, updated IT infrastructure, end-user support, and data and system security. The contributions of this paper are: (1) the identification of organizational themes for IT consumerization; (2) the proposed effects on the business value of IT, IT capabilities, and the IT function; and (3) the combination of empirical insights into IT consumerization with managerial theories in the IS discipline.

Relevance: 20.00%

Abstract:

In 2009, the National Research Council of the National Academies released its report A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration – re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ It defined the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life – genes, cells, organisms, ecosystems – and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels has generated investment in fields such as biofuels, climate-ready crops and the storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, the biological futurist Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences.
Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction provides a thumbnail, comparative overview of recent developments in intellectual property and biotechnology as a foundation for the collection. Section I considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence on intellectual property and biotechnology. Section III surveys developments in the European Union and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, considering the policy responses to the controversy over Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.

Relevance: 20.00%

Abstract:

This project was a step forward in discovering the potential role of intestinal cell kinase in prostate cancer development. Intestinal cell kinase was shown to be upregulated in prostate cancer cells, and its altered expression led to changes in key cell survival proteins. This study used in vitro experiments to monitor changes in cell growth and in protein and RNA expression.

Relevance: 20.00%

Abstract:

Background: The purpose of this study was to adapt the Foot Function Index to Spanish (FFI-Sp) and validate it, following the guidelines of the American Academy of Orthopaedic Surgeons. Methods: A cross-sectional study was conducted with 80 participants with some foot pathology. A statistical analysis was performed, including a correlation study with other questionnaires (the Foot Health Status Questionnaire, EuroQol 5-D, Visual Analogue Pain Scale, and the Short Form SF-12 Health Survey). Data analysis included reliability, construct and criterion-related validity, and factor analyses. Results: The principal components analysis with varimax rotation produced 3 principal factors that explained 80% of the variance. The confirmatory factor analysis showed an acceptable fit, with a comparative fit index of 0.78. The FFI-Sp demonstrated excellent internal consistency on the pain (0.95) and disability (0.96) subscales; activity limitation, at 0.69, scored lowest. The correlation between the FFI-Sp and the other questionnaires was moderate to high. Conclusions: The Spanish version of the Foot Function Index (FFI-Sp) is a valid and reliable tool with very good internal consistency for assessing pain, disability and limitation of foot function, in both clinical and research settings.
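
Since the internal-consistency figures above are of the kind typically computed as Cronbach's alpha, a minimal sketch of that computation follows (an illustration only, not the study's code; the array shapes and simulated data are assumptions):

```python
# Minimal sketch of Cronbach's alpha, the usual internal-consistency
# statistic behind figures like the 0.95/0.96/0.69 reported above.
# `items` is an (n_participants x n_items) array of item scores for one
# subscale; all names and the simulated data are illustrative only.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated example: 80 participants, 9 items driven by one latent trait
rng = np.random.default_rng(0)
latent = rng.normal(size=(80, 1))
items = latent + 0.5 * rng.normal(size=(80, 9))
print(f"alpha = {cronbach_alpha(items):.2f}")
```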

Relevance: 20.00%

Abstract:

Background: The evaluation of hand function is an essential element of clinical practice. The usual assessments focus on the ability to perform activities of daily living. The inclusion of instruments that measure kinematic variables provides a new approach to this assessment, and inertial sensors adapted to the hand could be used as a complement to traditional assessment. Material: Clinimetric assessments (Upper Limb Functional Index, QuickDASH), anthropometric variables (height and weight) and dynamometry (palm pressure) were taken. Functional analysis was performed with the AcceleGlove system on the right hand and a computer system. The glove has six acceleration sensors, one on each finger and another on the back of the palm. Method: Analytic, cross-sectional design. Ten healthy subjects performed six tasks on an evaluation table (tripod pinch, lateral pinch, tip pinch, extension grip, spherical grip and power grip). Each task was performed and measured three times, and the second repetition was analysed for the results section. A Matlab script was created for the analysis of each movement and for phase detection based on the modulus of the acceleration vector. Results: The modulus of the acceleration vector offers useful information about hand function. Analysis of the data obtained during the performance of functional gestures allows five different phases to be identified within each movement – three static phases and two dynamic phases – with a modulus-vector profile associated with each task. Conclusion: Modulus-vector variables could be used for the analysis of the different tasks performed by the hand, and inertial sensors could be used as a complement to traditional assessment systems.
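
As a rough illustration of the modulus-based phase detection described above (a sketch in Python rather than the study's Matlab; the sampling rate, window length and threshold are invented for illustration):

```python
# Sketch: per-sample acceleration-vector modulus, then a simple
# windowed-variability rule to split a movement into static and
# dynamic phases. Threshold and window length are assumptions.
import numpy as np

def modulus(acc: np.ndarray) -> np.ndarray:
    """acc: (n_samples, 3) accelerometer stream -> per-sample |a|."""
    return np.linalg.norm(acc, axis=1)

def label_phases(mod: np.ndarray, fs: float,
                 win_s: float = 0.1, thresh: float = 0.05) -> np.ndarray:
    """Label a sample dynamic (1) when the moving standard deviation of
    |a| over a `win_s`-second window exceeds `thresh`, static (0) otherwise."""
    win = max(1, int(win_s * fs))
    pad = np.pad(mod, (win // 2, win - win // 2 - 1), mode="edge")
    moving_std = np.array([pad[i:i + win].std() for i in range(len(mod))])
    return (moving_std > thresh).astype(int)

# Example: 2 s of a simulated grasp at 100 Hz - still, brief motion, still
fs = 100.0
t = np.arange(0, 2, 1 / fs)
acc = np.zeros((len(t), 3)); acc[:, 2] = 1.0              # gravity only
acc[80:120, 0] += 0.4 * np.sin(2 * np.pi * 5 * t[80:120])  # brief motion
print(label_phases(modulus(acc), fs).sum(), "dynamic samples")
```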

Relevance: 20.00%

Abstract:

Diffusion-weighted magnetic resonance (MR) imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of 6 directions, second-order tensors can be computed to model dominant diffusion processes. However, conventional diffusion tensor imaging (DTI) is not sufficient to resolve crossing fiber tracts. Recently, a number of high-angular resolution schemes with greater than 6 gradient directions have been employed to address this issue. In this paper, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric positive definite matrices. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF, and the optimal TDF is the one that best explains the observed diffusion-weighted data. Once this optimal TDF is determined, the diffusion orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function.
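
As a sketch of the generative model just described (notation assumed from the TDF literature rather than quoted from this abstract), the diffusion-weighted signal is a TDF-weighted superposition of Gaussian tensor signals:

\[
S(\mathbf{q}) \;=\; S_0 \int_{\mathcal{D}} P(D)\,
\exp\!\left(-b\,\hat{\mathbf{q}}^{\top} D\,\hat{\mathbf{q}}\right)\mathrm{d}D,
\qquad
\int_{\mathcal{D}} P(D)\,\mathrm{d}D \;=\; 1,
\]

where \(\mathcal{D}\) is the space of symmetric positive definite matrices, \(P\) is the TDF, \(b\) the diffusion weighting and \(\hat{\mathbf{q}}\) the gradient direction. The ODF then follows by integrating the resulting mixture of displacement densities radially along each direction.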

Relevance: 20.00%

Abstract:

Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept as it is influenced by several quantities including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. With high-angular resolution diffusion imaging (HARDI) and the tensor distribution function (TDF), one can reconstruct multiple underlying fibers per voxel and their individual anisotropy measures by representing the diffusion profile as a probabilistic mixture of tensors. We found that FA, when compared with TDF-derived anisotropy measures, correlates poorly with individual fiber anisotropy, and may sub-optimally detect disease processes that affect myelination. By contrast, mean diffusivity (MD) as defined in standard DTI appears to be more accurate. Overall, we argue that novel measures derived from the TDF approach may yield more sensitive and accurate information than DTI-derived measures.
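
For reference, the standard single-tensor definitions at issue here (textbook DTI formulas, with \(\lambda_1, \lambda_2, \lambda_3\) the eigenvalues of the fitted tensor and \(\bar{\lambda}\) their mean):

\[
\mathrm{FA} \;=\; \sqrt{\tfrac{3}{2}}\,
\frac{\sqrt{(\lambda_1-\bar{\lambda})^2+(\lambda_2-\bar{\lambda})^2+(\lambda_3-\bar{\lambda})^2}}
     {\sqrt{\lambda_1^{2}+\lambda_2^{2}+\lambda_3^{2}}},
\qquad
\mathrm{MD} \;=\; \bar{\lambda} \;=\; \frac{\lambda_1+\lambda_2+\lambda_3}{3}.
\]

Where two fiber populations cross, the single fitted tensor flattens and FA drops even if each constituent fiber is fully intact – the confound that motivates the per-fiber, TDF-derived anisotropy measures.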

Relevance: 20.00%

Abstract:

High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the trade-off between longer scanning times and greater angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs), which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.
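
The reconstruction-accuracy measure named above is the Kullback-Leibler divergence; in its standard (asymmetric) form for a true ODF \(p\) and reconstructed ODF \(q\) on the unit sphere (the sphere-integral formulation is assumed here, not quoted from the paper):

\[
D_{\mathrm{KL}}(p \,\|\, q) \;=\; \int_{S^{2}} p(\hat{\mathbf{u}})\,
\log\frac{p(\hat{\mathbf{u}})}{q(\hat{\mathbf{u}})}\,\mathrm{d}\hat{\mathbf{u}},
\]

which vanishes exactly when the two ODFs agree, so smaller values indicate a more faithful reconstruction.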

Relevance: 20.00%

Abstract:

We demonstrate a geometrically inspired technique for computing Evans functions for the linearised operators about travelling waves. Using the examples of the F-KPP equation and a Keller–Segel model of bacterial chemotaxis, we produce an Evans function which is computable through several orders of magnitude in the spectral parameter and show how such a function can naturally be extended into the continuous spectrum. In both examples, we use this function to numerically verify the absence of eigenvalues in a large region of the right half of the spectral plane. We also include a new proof of spectral stability in the appropriate weighted space of travelling waves of speed c ≥ √(2δ) in the F-KPP equation.
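
As a sketch of the object being computed (a generic formulation under assumed notation; the paper's precise scaling may differ): linearising a scalar reaction-diffusion equation \(u_t = \delta u_{xx} + f(u)\) about a travelling wave \(\hat{u}(z)\), \(z = x - ct\), gives the eigenvalue problem

\[
\mathcal{L}p \;:=\; \delta\,p'' + c\,p' + f'(\hat{u})\,p \;=\; \lambda p.
\]

Writing this as a first-order system and letting \(p^{-}(z;\lambda)\) and \(p^{+}(z;\lambda)\) denote the solutions decaying as \(z \to -\infty\) and \(z \to +\infty\) respectively, the Evans function is the Wronskian-type determinant

\[
E(\lambda) \;=\; \det\!\begin{pmatrix} p^{-} & p^{+} \\ (p^{-})' & (p^{+})' \end{pmatrix}\Bigg|_{z=0},
\]

an analytic function whose zeros coincide, with multiplicity, with the eigenvalues of \(\mathcal{L}\). Verifying \(E(\lambda) \neq 0\) over a region of the right half-plane therefore rules out unstable point spectrum there.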

Relevance: 20.00%

Abstract:

Coloured foliage due to anthocyanin pigments (bronze/red/black) is an attractive trait that is often lacking in many bedding, ornamental and horticultural plants. Apples (Malus × domestica) containing an allelic variant of the anthocyanin regulator, Md-MYB10R6, are highly pigmented throughout the plant, due to autoregulation by MYB10 of its own promoter. We investigated whether Md-MYB10R6 from apple is capable of functioning within the heterologous host Petunia hybrida to generate plants with novel pigmentation patterns. The Md-MYB10R6 transgene (MYB10-R6pro:MYB10:MYB10term) activated anthocyanin synthesis when transiently expressed in Antirrhinum rosea dorsea petals and petunia leaf discs. Stable transgenic petunias containing Md-MYB10R6 lacked foliar pigmentation but had coloured flowers, complementing the an2 phenotype of ‘Mitchell’ petunia. The absence of foliar pigmentation was due to the failure of the Md-MYB10R6 gene to self-activate in vegetative tissues, suggesting that additional protein partners are required for Md-MYB10 to activate target genes in this heterologous system. In petunia flowers, where endogenous components including MYB-bHLH-WDR (MBW) proteins were present, expression of the Md-MYB10R6 promoter was initiated, allowing auto-regulation to occur and activating anthocyanin production. Md-MYB10 is capable of operating within the petunia MBW gene-regulation network that controls the expression of the anthocyanin biosynthesis genes, AN1 (bHLH) and MYBx (R3-MYB repressor), in petals.

Relevance: 20.00%

Abstract:

Glutamine is conditionally essential in cancer cells, being utilized as a carbon and nitrogen source for macromolecule production, as well as for anaplerotic reactions fuelling the tricarboxylic acid (TCA) cycle. In this study, we demonstrated that the glutamine transporter ASCT2 (SLC1A5) is highly expressed in prostate cancer patient samples. Using LNCaP and PC-3 prostate cancer cell lines, we showed that chemical or shRNA-mediated inhibition of ASCT2 function in vitro decreases glutamine uptake, cell cycle progression through E2F transcription factors, mTORC1 pathway activation and cell growth. Chemical inhibition also reduces basal oxygen consumption and fatty acid synthesis, showing that downstream metabolic function is reliant on ASCT2-mediated glutamine uptake. Furthermore, shRNA knockdown of ASCT2 in PC-3 cell xenografts significantly inhibits tumour growth and metastasis in vivo, associated with the down-regulation of E2F cell cycle pathway proteins. In conclusion, ASCT2-mediated glutamine uptake is essential for multiple pathways regulating the cell cycle and cell growth, and is therefore a putative therapeutic target in prostate cancer.

Relevance: 20.00%

Abstract:

Diffusion-weighted magnetic resonance imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of six directions, second-order tensors (represented by three-by-three positive definite matrices) can be computed to model dominant diffusion processes. However, conventional DTI is not sufficient to resolve more complicated white matter configurations, e.g., crossing fiber tracts. Recently, a number of high-angular resolution schemes with more than six gradient directions have been employed to address this issue. In this article, we introduce the tensor distribution function (TDF), a probability function defined on the space of symmetric positive definite matrices. Using the calculus of variations, we solve for the TDF that optimally describes the observed data. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. Once this optimal TDF is determined, the orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function. Moreover, a tensor orientation distribution function (TOD) may also be derived from the TDF, allowing for the estimation of principal fiber directions and their corresponding eigenvalues.
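
To make the analytic-integration step concrete, here is a minimal sketch for a discretised TDF – weights w_i on a finite set of candidate tensors D_i – where radial integration of the mixture of zero-mean Gaussian displacement densities gives a closed form for the ODF (an illustration under assumed notation, not the authors' implementation):

```python
# Sketch: closed-form ODF from a discrete TDF. Each Gaussian component
# contributes |D|^(-1/2) * (u^T D^{-1} u)^(-3/2) along unit direction u,
# weighted by its TDF weight w. Tensors and weights are assumptions.
import numpy as np

def odf_from_discrete_tdf(weights, tensors, directions):
    """weights: (k,) TDF weights; tensors: (k, 3, 3) SPD matrices;
    directions: (m, 3) unit vectors sampling the sphere."""
    odf = np.zeros(len(directions))
    for w, D in zip(weights, tensors):
        D_inv = np.linalg.inv(D)
        quad = np.einsum("mi,ij,mj->m", directions, D_inv, directions)
        odf += w * quad ** -1.5 / np.sqrt(np.linalg.det(D))
    return odf / odf.sum()  # normalise over the sampled directions

# Example: two fibers crossing at 90 degrees, probed along x, y, z
D1 = np.diag([1.7e-3, 0.2e-3, 0.2e-3])   # fiber along x
D2 = np.diag([0.2e-3, 1.7e-3, 0.2e-3])   # fiber along y
dirs = np.eye(3)
print(odf_from_discrete_tdf([0.5, 0.5], [D1, D2], dirs))
```

The printed ODF is largest along x and y (the two fiber directions) and smallest along z – exactly the multi-peak structure a single tensor cannot represent.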

Relevance: 20.00%

Abstract:

The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed in which it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is afforded by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison with utilities for model discrimination and parameter estimation alone. The results suggest that the total entropy utility selects designs that are efficient under both experimental goals with little compromise in achieving either. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
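
A sketch of the decomposition that motivates the name 'total entropy' (a standard information-theoretic identity, stated here under assumed notation rather than quoted from the abstract): for a design \(d\), data \(y\), model indicator \(m\) and within-model parameters \(\theta_m\), the expected joint information gain splits by the chain rule into a model-discrimination term and a parameter-estimation term,

\[
U_T(d) \;=\;
\underbrace{\mathbb{E}_{y \mid d}\,\mathrm{KL}\!\left(p(m \mid y, d)\,\|\,p(m)\right)}_{\text{model discrimination}}
\;+\;
\underbrace{\mathbb{E}_{m,\,y \mid d}\,\mathrm{KL}\!\left(p(\theta_m \mid y, m, d)\,\|\,p(\theta_m \mid m)\right)}_{\text{parameter estimation}},
\]

so maximising \(U_T\) trades the two goals off within a single utility – consistent with the reported finding that total-entropy designs remain efficient under both. The particle approximations produced by the sequential Monte Carlo algorithm supply the posterior quantities needed to estimate both expectations.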