980 results for Almost common value auctions


Relevance: 30.00%

Abstract:

The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli: they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days, and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins.

Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized in large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case.

Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, paying particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts.

Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option for preparing samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques.

E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines.

The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date.

It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration when synthesizing proteins in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cell lines and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications.

The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments.

A current drawback of NMR as a structure determination tool derives from size limitations on the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins, including ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’.

Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature: they are long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced and its high-resolution structure determined.

In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven uniquely useful as crystallization chaperones that coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens, including G protein-coupled receptors (GPCRs).

Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance: 30.00%

Abstract:

L. I. Karandzhulov, N. D. Sirakova - The paper applies the Poincaré method to the solution of almost regular nonlinear boundary value problems with general boundary conditions. The differential system is assumed to contain a function that is singular with respect to the small parameter. Under certain conditions, the asymptotic behaviour of the solution of the stated problem is proved.

Relevance: 30.00%

Abstract:

Purpose – The objective of this paper is to address the question of whether, and how, firms can follow a standardized management process to cope with emerging corporate social responsibility (CSR) challenges. Both researchers and practitioners have paid increasing attention to this question because of the rapidly evolving CSR expectations of stakeholders and the limited diffusion of CSR standardization. The question is addressed by developing a theoretical framework to explain how dynamic capabilities can contribute to effective CSR management. Design/methodology/approach – Based on the contemporary CSR reports of 64 world-leading companies, we carried out a large-scale content analysis to identify and examine the common organizational processes involved in CSR management and the dynamic capabilities underpinning those management processes. Findings – Drawing on the dynamic capabilities perspective, we demonstrate how the deployment of three dynamic capabilities for CSR management, namely scanning, sensing and reconfiguration capabilities, can help firms to meet emerging CSR requirements by following a set of common management processes. The findings demonstrate that what matters most in CSR standardization is the identification and development of the underlying dynamic capabilities and the related organizational processes and routines, rather than the detailed operational activities. Originality/value – Our study is an early attempt to examine the fundamental organizational capabilities and processes involved in CSR management from the dynamic capabilities perspective. Our findings contribute to the CSR standardization literature by providing a new theoretical perspective for better understanding the capabilities that enable common CSR management processes.

Relevance: 30.00%

Abstract:

Wikis are quickly emerging as a new corporate medium for communication and collaboration. They allow dispersed groups of collaborators to engage asynchronously in persistent conversations, the result of which is stored on a common server as a single, shared truth. To gauge the enterprise value of wikis, the authors draw on Media Choice Theories (MCTs) as an evaluation framework. MCTs reveal core capabilities of communication media and their fit with the communication task. Based on this evaluation, the authors argue that wikis are equivalent or superior to existing asynchronous communication media in key characteristics. They further argue that wiki technology challenges some of the commonly held beliefs of existing media choice theories, as wikis introduce media characteristics not previously envisioned. The authors thus predict a promising future for wiki use in enterprises.

Relevance: 30.00%

Abstract:

Building and operating knowledge management systems is an increasingly popular corporate goal. Even the most determined effort can end in failure, however, if such a change is attempted without the necessary preconditions being in place in the life of the organization. One of the most important preconditions is an open organizational culture built on trust, shared learning and development, which can be described by the characteristics of a learning organization. In their research, the authors investigated what ideas and aspirations colleagues teaching in higher education have about what they would call an ideal organizational culture. These views were gathered through a questionnaire survey conducted by an external consulting firm and evaluated using the circumplex method. The results were compared with the characteristics of learning organizations to test the hypothesis that the teaching staff, even subconsciously, envisage an ideal organization whose features coincide with those of a learning organizational culture.

Relevance: 30.00%

Abstract:

Heterogeneity of labour and its implications for the Marxian theory of value has been one of the most controversial issues in the literature of Marxist political economy. The adoption of Marx's conjecture of a uniform rate of surplus value leads to a simultaneous determination of the values of ordinary commodities, the values of the different types of labour power, and the uniform rate of surplus value. The determination of these variables can be formally represented as a parametric eigenvalue problem. Morishima's and Bródy's earlier results are analysed and given new interpretations in the light of the suggested procedure. The main questions are also addressed in a more general context, and the analysis is extended to the problem of segmented labour markets.
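
As a simplified illustration of the structure involved (the homogeneous-labour special case only; the paper's contribution is the joint, heterogeneous treatment), labour values v and the uniform rate of surplus value e satisfy

\[
  v = vA + l, \qquad e = \frac{1 - vb}{vb},
\]

where A is the input coefficient matrix, l the direct labour vector and b the wage basket consumed per unit of labour. With several types of labour, l and b become type-specific, and the joint determination of commodity values, the reduction coefficients for the labour types and e can be written as an eigenvalue problem parametrized by (1 + e).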

Relevance: 30.00%

Abstract:

This case study examines the factors that shaped the identity and landscape of a small island urban village between the north and south forks of the Middle River, north of an urban area in Broward County, Florida. The purpose of the study is to understand how Wilton Manors was transformed from a "whites only" enclave into a contemporary upscale, diverse city, reputedly the third gayest in the U.S., by positing that a dichotomy exists for urban places between their exchange value, as theorized by Logan and Molotch, and the use value produced through everyday activity, following Lefebvre. Qualitative methods were used to gather evidence for conclusions about the relationship among the worldview of residents, the tension between exchange value and use value in the restructuration of the city, and the transformation of Wilton Manors at the end of the 1990s. Semi-structured, in-depth interviews were conducted with 21 contemporary participants. In addition, thirteen CDs of interviews with selected members of founding families, recorded in the 1970s, were analyzed using a grounded theory approach. My findings indicate that Wilton Manors' residents share a common worldview which incorporates social inclusion as a use value, and individual agency in the community. This shared worldview can be traced to selected city pioneers whose civic-mindedness helped shape city identity and laid the foundation for future restructuration. Currently, residents' quality of life, reflected in the city's use value, is more significant than exchange value as a primary force in the decisions that are made about the city's development. With innovative ideas, buildings emulating the new urban mixed-use design, and a reputation as the third gayest city in the United States, Wilton Manors reflects a worldview in which residents protect use value as primary over exchange value in the decisions that shape their city, though not without contestation.

Relevance: 30.00%

Abstract:

In the discussion "Ethics, Value Systems and the Professionalization of Hoteliers" by K. Michael Haywood, Associate Professor, School of Hotel and Food Administration, University of Guelph, Haywood initially presents: "Hoteliers and executives in other service industries should realize that the foundation of success in their businesses is based upon personal and corporate value systems and steady commitment to excellence. The author illustrates how ethical issues and manager morality are linked to, and shaped by, the values of executives and the organization, and how improved professionalism can only be achieved through the adoption of a value system that rewards contributions rather than the mere attainment of results." The bottom line of this discussion is: how does the hotel industry reconcile its behavior with public perception? "The time has come for hoteliers to examine their own standards of ethics, value systems, and professionalism," Haywood says. Ethics is at the center of this issue, and Haywood holds that component in high regard. "Hoteliers must become value-driven," advises Haywood. "They must be committed to excellence both in actualizing their best potentialities and in excelling in all they do. In other words, the professionalization of the hotelier can be achieved through a high degree of self-control, internalized values, codes of ethics, and related socialization processes," he expands. "Serious ethical issues exist for hoteliers as well as for many business people and professionals in positions of responsibility," Haywood notes in defining some intra-industry problems. "The acceptance of kickbacks and gifts from suppliers, the hiding of income from taxation authorities, the lack of interest in installing and maintaining proper safety and security systems, and the raiding of competitors' staffs are common practices," he offers, reasoning that if such problems can occur within the industry's own ranks, a negative backlash in the public and client arena will follow. Haywood divides the key principles of his thesis (ethics, value systems, and professionalism) into specific elements, and then broadens the scope of each. Promotion, product/service, and pricing are additional key components of Haywood's discussion, and he addresses each with verve and vitality. Haywood references the four character types (craftsmen, jungle fighters, company men, and gamesmen) by citing Michael Maccoby in the portion of the discussion dedicated to morality and success. Haywood closes with a series of questions derived from Lawrence Miller's American Spirit: Visions of a New Corporate Culture, each designed to focus, shape, and organize management's attention on the values that Miller sets forth.

Relevance: 30.00%

Abstract:

Seascape ecology provides a useful framework for understanding the processes governing spatial variability in ecological patterns. Seascape context, the composition and pattern of habitat surrounding a focal patch, has the potential to affect resource availability, predator-prey interactions, and connectivity with other habitats. For my dissertation research, I combined a variety of approaches to examine how habitat quality for fishes is influenced by a diverse range of seascape factors in sub-tropical, back-reef ecosystems. In the first part of my dissertation, I examined how seascape context can affect reef fish communities on an experimental array of artificial reefs created in various seascape contexts in Abaco, Bahamas. I found that the amount of seagrass at large spatial scales was an important predictor of community assembly on these reefs. Additionally, seascape context had differing effects on various aspects of habitat quality for the most common reef species, the white grunt Haemulon plumierii: the amount of seagrass at large spatial scales had positive effects on fish abundance and secondary production, but not on metrics of condition and growth. The second part of my dissertation focused on how foraging conditions for fish varied across a linear seascape gradient in the Loxahatchee River estuary in Florida, USA. Gray snapper, Lutjanus griseus, traded food quality for quantity along this estuarine gradient, maintaining similar growth rates and condition among sites. Additional work focused on identifying the major energy flow pathways to two consumers in oyster-reef food webs in the Loxahatchee. Algal and microphytobenthos resource pools supported most of the production of these consumers, and body size in one of the consumers mediated food-web linkages with surrounding mangrove habitats. Each of these studies examined a different facet of the importance of seascape context in governing ecological processes in focal habitats, and together they underscore the role of connectivity among habitats in back-reef systems. The results suggest that management should consider the surrounding seascape when prioritizing areas for conservation or attempting to understand the impacts of seascape change on focal habitat patches; for this reason, spatially based management approaches are recommended as the most effective way to manage back-reef systems.

Relevance: 30.00%

Abstract:

Service supply chain (SSC) management has attracted increasing attention from academia and industry. Although extensive product-based supply chain management models and methods exist, they are not directly applicable to the SSC because of the differences between services and products. Moreover, existing supply chain management models and methods share some common deficiencies. For these reasons, this paper develops a novel value-oriented model for the management of SSC using the modeling methods of E3-value and Use Case Maps (UCMs). The model not only resolves the applicability and effectiveness problems of existing supply chain management models and methods, but also answers the questions of why the management model takes this form and how the potential profitability of the supply chain can be quantified. In addition, the service business processes of the SSC system can be established by following the model's logic. The model can also determine the distribution of value and benefits across the entire service value chain and help optimize the operations management performance of the service supply chain.

Relevance: 30.00%

Abstract:

Many modern applications fall into the category of "large-scale" statistical problems, in which both the number of observations n and the number of features or parameters p may be large. Many existing methods focus on point estimation, despite the continued relevance of uncertainty quantification in the sciences, where the number of parameters to estimate often exceeds the sample size even after the huge increases in n typically seen in many fields. The tendency in some areas of industry to dispense with traditional statistical analysis on the basis that "n = all" is thus of little relevance outside of certain narrow applications. The main result of the Big Data revolution in most fields has instead been to make computation much harder without reducing the importance of uncertainty quantification. Bayesian methods excel at uncertainty quantification, but often scale poorly relative to alternatives. This conflict between the statistical advantages of Bayesian procedures and their substantial computational disadvantages is perhaps the greatest challenge facing modern Bayesian statistics, and is the primary motivation for the work presented here.

Two general strategies for scaling Bayesian inference are considered. The first is the development of methods that lend themselves to faster computation, and the second is design and characterization of computational algorithms that scale better in n or p. In the first instance, the focus is on joint inference outside of the standard problem of multivariate continuous data that has been a major focus of previous theoretical work in this area. In the second area, we pursue strategies for improving the speed of Markov chain Monte Carlo algorithms, and characterizing their performance in large-scale settings. Throughout, the focus is on rigorous theoretical evaluation combined with empirical demonstrations of performance and concordance with the theory.

One topic we consider is modeling the joint distribution of multivariate categorical data, often summarized in a contingency table. Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced-rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. In Chapter 2, we derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridges existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions.
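
To make the two notions of dimensionality reduction concrete, a k-class latent structure model factorizes the joint probability mass function as (a standard formulation, recalled here for illustration)

\[
  P(y_1 = c_1, \ldots, y_p = c_p) \;=\; \sum_{h=1}^{k} \nu_h \prod_{j=1}^{p} \lambda^{(j)}_{h c_j},
  \qquad \nu_h \ge 0, \quad \sum_{h=1}^{k} \nu_h = 1,
\]

so that k bounds the nonnegative rank of the probability tensor, whereas a log-linear model writes \(\log P(y)\) as a sum of interaction terms \(\theta_S(y_S)\) over subsets \(S \subseteq \{1, \ldots, p\}\) and achieves parsimony by setting most \(\theta_S\) to zero.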

Latent class models for the joint distribution of multivariate categorical data, such as the PARAFAC decomposition, play an important role in the analysis of population structure. In this context, the number of latent classes is interpreted as the number of genetically distinct subpopulations of an organism, an important factor in the analysis of evolutionary processes and conservation status. Existing methods focus on point estimates of the number of subpopulations and lack robust uncertainty quantification. Moreover, whether the number of latent classes in these models is even an identified parameter is an open question. In Chapter 3, we show that when the model is properly specified, the correct number of subpopulations can be recovered almost surely. We then propose an alternative method for estimating the number of latent subpopulations that provides good quantification of uncertainty, and give a simple procedure for verifying that the proposed method is consistent for the number of subpopulations. The performance of the model in estimating the number of subpopulations, and in other common population structure inference problems, is assessed in simulations and a real data application.

In contingency table analysis, sparse data are frequently encountered for even modest numbers of variables, resulting in non-existence of maximum likelihood estimates. A common solution is to obtain regularized estimates of the parameters of a log-linear model. Bayesian methods provide a coherent approach to regularization, but are often computationally intensive. Conjugate priors ease computational demands, but the conjugate Diaconis–Ylvisaker priors for the parameters of log-linear models do not give rise to closed-form credible regions, complicating posterior inference. In Chapter 4 we derive the optimal Gaussian approximation to the posterior for log-linear models with Diaconis–Ylvisaker priors, and provide convergence rates and finite-sample bounds for the Kullback–Leibler divergence between the exact posterior and the optimal Gaussian approximation. We demonstrate empirically, in simulations and a real data application, that the approximation is highly accurate, even in relatively small samples. The proposed approximation provides a computationally scalable and principled approach to regularized estimation and approximate Bayesian inference for log-linear models.
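
In one standard sense of optimality, assumed here for illustration, the optimal Gaussian approximation minimizes the Kullback–Leibler divergence from the exact posterior \(\pi\) over the Gaussian family \(\mathcal{G}\):

\[
  \hat q \;=\; \arg\min_{q \in \mathcal{G}} \mathrm{KL}(\pi \,\|\, q)
         \;=\; N\big(\mathbb{E}_\pi[\theta], \, \mathrm{Cov}_\pi(\theta)\big),
\]

since in this direction the divergence is minimized by matching the posterior mean and covariance.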

Another challenging and somewhat non-standard joint modeling problem is inference on tail dependence in stochastic processes. In applications where extreme dependence is of interest, data are almost always time-indexed. Existing methods for inference and modeling in this setting often cluster extreme events or choose window sizes with the goal of preserving temporal information. In Chapter 5, we propose an alternative paradigm for inference on tail dependence in stochastic processes with arbitrary temporal dependence structure in the extremes, based on the idea that the information on strength of tail dependence and the temporal structure in this dependence are both encoded in waiting times between exceedances of high thresholds. We construct a class of time-indexed stochastic processes with tail dependence obtained by endowing the support points in de Haan's spectral representation of max-stable processes with velocities and lifetimes. We extend Smith's model to these max-stable velocity processes and obtain the distribution of waiting times between extreme events at multiple locations. Motivated by this result, a new definition of tail dependence is proposed that is a function of the distribution of waiting times between threshold exceedances, and an inferential framework is constructed for estimating the strength of extremal dependence and quantifying uncertainty in this paradigm. The method is applied to climatological, financial, and electrophysiology data.
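
A minimal sketch of the basic ingredient used here, the waiting times between exceedances of a high threshold, is given below; the function name and the quantile-based threshold choice are illustrative assumptions, not the paper's code.

import numpy as np

def exceedance_waiting_times(x, u):
    """Waiting times (in index units) between exceedances of threshold u."""
    exceed_times = np.flatnonzero(np.asarray(x) > u)  # indices where x exceeds u
    return np.diff(exceed_times)                      # gaps between successive exceedances

# Example: waiting times above the empirical 99th percentile of a simulated series.
x = np.random.default_rng(1).standard_normal(10_000)
w = exceedance_waiting_times(x, np.quantile(x, 0.99))

In this paradigm, clusters of short waiting times indicate strong temporal dependence in the extremes, while the overall rate of exceedances reflects the marginal tail behaviour.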

The remainder of this thesis focuses on posterior computation by Markov chain Monte Carlo (MCMC), the dominant paradigm for posterior computation in Bayesian analysis. It has long been common to control computation time by making approximations to the Markov transition kernel, but comparatively little attention has been paid to convergence and estimation error in the resulting approximating Markov chains. In Chapter 6, we propose a framework for assessing when to use approximations in MCMC algorithms, and how much error in the transition kernel should be tolerated to obtain optimal estimation performance with respect to a specified loss function and computational budget. The results require only ergodicity of the exact kernel and control of the kernel approximation accuracy. The theoretical framework is applied to approximations based on random subsets of data, low-rank approximations of Gaussian processes, and a novel approximating Markov chain for discrete mixture models.
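
As a hedged sketch of the first of these approximation families (random subsets of data), a random-walk Metropolis algorithm with a subsampled log-likelihood might look as follows; the names, the minibatch scaling and the caching of the current log-posterior are illustrative assumptions, and it is precisely this kind of perturbed kernel whose error such a framework quantifies.

import numpy as np

def subsampled_rw_metropolis(loglik_i, logprior, data, theta0,
                             n_iter=5000, step=0.1, batch=500, seed=0):
    """Random-walk Metropolis with an approximate (subsampled) likelihood.

    loglik_i(theta, rows) returns per-observation log-likelihoods; the full
    log-likelihood is approximated by (n / batch) times the sum over a random
    minibatch, trading kernel accuracy for per-iteration cost.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    n = len(data)

    def approx_logpost(theta):
        idx = rng.choice(n, size=batch, replace=False)  # random subset of the data
        return logprior(theta) + (n / batch) * loglik_i(theta, data[idx]).sum()

    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lp = approx_logpost(theta)
    chain = np.empty((n_iter, theta.size))
    for t in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)  # random-walk proposal
        lp_prop = approx_logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:               # approximate MH acceptance
            theta, lp = prop, lp_prop
        chain[t] = theta
    return chain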

Data augmentation Gibbs samplers are arguably the most popular class of algorithm for approximately sampling from the posterior distribution for the parameters of generalized linear models. The truncated Normal and Polya-Gamma data augmentation samplers are standard examples for probit and logit links, respectively. Motivated by an important problem in quantitative advertising, in Chapter 7 we consider the application of these algorithms to modeling rare events. We show that when the sample size is large but the observed number of successes is small, these data augmentation samplers mix very slowly, with a spectral gap that converges to zero at a rate at least proportional to the reciprocal of the square root of the sample size up to a log factor. In simulation studies, moderate sample sizes result in high autocorrelations and small effective sample sizes. Similar empirical results are observed for related data augmentation samplers for multinomial logit and probit models. When applied to a real quantitative advertising dataset, the data augmentation samplers mix very poorly. Conversely, Hamiltonian Monte Carlo and a type of independence chain Metropolis algorithm show good mixing on the same dataset.
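
For concreteness, a minimal version of the truncated-Normal (Albert–Chib) data augmentation Gibbs sampler for probit regression is sketched below; the variable names and the N(0, prior_var·I) prior are illustrative assumptions. The slow mixing described above appears when y contains very few successes relative to the sample size.

import numpy as np
from scipy.stats import truncnorm

def probit_gibbs(X, y, n_iter=2000, prior_var=100.0, seed=0):
    """Albert–Chib data augmentation Gibbs sampler for probit regression (y is 0/1)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    V = np.linalg.inv(X.T @ X + np.eye(p) / prior_var)  # posterior covariance of beta
    L = np.linalg.cholesky(V)
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        mu = X @ beta
        # Latent utilities: N(mu, 1) truncated to (0, inf) if y = 1, (-inf, 0) if y = 0.
        lo = np.where(y == 1, -mu, -np.inf)
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, random_state=rng)
        # Conjugate Gaussian update for beta given the latent z.
        beta = V @ (X.T @ z) + L @ rng.standard_normal(p)
        draws[t] = beta
    return draws

Monitoring the autocorrelation of the draws (for example via effective sample size) as the number of successes shrinks reproduces the qualitative behaviour described above.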

Relevance: 30.00%

Abstract:

Waterways have many more ties with society than serving as a medium for the transportation of goods alone; waterway systems offer society many kinds of socio-economic value. Waterway authorities responsible for management and (re)development need to optimize the public benefits of the investments made. However, due to the many trade-offs in the system, these agencies have multiple options for achieving this goal. Because they can invest resources in a great many different ways, they need a way to assess the efficiency of the decisions they make. Transaction cost theory, and the analysis that goes with it, has emerged as an important means of justifying efficiency decisions in the economic arena. To improve our understanding of the value-creating and coordination problems facing waterway authorities, such a framework is applied to this sector. This paper describes the findings for two cases, which reflect two common multi-trade-off situations in waterway (re)development. Our first case study focuses on the Miami River, an urban revitalized waterway. The second describes the Inner Harbour Navigation Canal in New Orleans, a canal and lock in an industrialized zone in need of an upgrade to keep pace with market developments. The transaction cost framework proves useful in exposing a wide variety of value-creating opportunities and the resistances that come with them. These insights can offer infrastructure managers guidance on how to seize such opportunities.

Relevance: 30.00%

Abstract:

The Scottish Court of Session, drawing upon principles of the civil law tradition as well as arguments concerning broader national, social and cultural interests, rejects the concept of copyright at common law, a decision in direct conflict with that of Millar v. Taylor (1769). Lord Monboddo provides the dissenting opinion, drawing upon the labour theory of property rights, and argues for a unified approach to the issue under the common law of both England and Scotland.
Drawing upon Scottish Records Office archives, the commentary explores the background to, and substance of, the decision. It suggests that, given the nature of the economic threat which the Scottish reprint industry posed to the London book trade, particularly in relation to an increasingly lucrative export market, Hinton undermined much of the value of the decision in Millar. The conflict between Millar and Hinton made it almost inevitable that the question of literary property would soon reach the House of Lords.

Relevance: 30.00%

Abstract:

We present high-speed photometry and high-resolution spectroscopy of the eclipsing post-common-envelope binary QS Virginis (QS Vir). Our Ultraviolet and Visual Echelle Spectrograph (UVES) spectra span multiple orbits over more than a year and reveal the presence of several large prominences passing in front of both the M star and its white dwarf companion, allowing us to triangulate their positions. Despite showing small variations on a time-scale of days, they persist for more than a year and may last decades. One large prominence extends almost three stellar radii from the M star. Roche tomography reveals that the M star is heavily spotted and that these spots are long-lived and in relatively fixed locations, preferentially found on the hemisphere facing the white dwarf. We also determine precise binary and physical parameters for the system. We find that the 14,220 ± 350 K white dwarf is relatively massive, 0.782 ± 0.013 M⊙, and has a radius of 0.01068 ± 0.00007 R⊙, consistent with evolutionary models. The tidally distorted M star has a mass of 0.382 ± 0.006 M⊙ and a radius of 0.381 ± 0.003 R⊙, also consistent with evolutionary models. We find that the magnesium absorption line from the white dwarf is broader than expected. This could be due to rotation (implying a spin period of only ~700 s) or to a weak (~100 kG) magnetic field; we favour the latter interpretation. Since the M star's radius is still within its Roche lobe and there is no evidence that it is overinflated, we conclude that QS Vir is most likely a pre-cataclysmic binary just about to become semidetached.
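
Under the rotational interpretation mentioned above, the quoted radius and ~700 s spin period correspond to an equatorial velocity of roughly

\[
  v_{\rm rot} \simeq \frac{2\pi R_{\rm WD}}{P_{\rm spin}}
  \approx \frac{2\pi \times 0.01068 \times 6.96 \times 10^{5}\ \mathrm{km}}{700\ \mathrm{s}}
  \approx 67\ \mathrm{km\,s^{-1}},
\]

which sets the scale of line broadening that rotation would have to supply.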

Relevance: 30.00%

Abstract:

Highly swellable polymer films doped with Ag nanoparticle aggregates (poly-SERS films) have been used to record very high signal-to-noise, reproducible surface-enhanced (resonance) Raman (SER(R)S) spectra of in situ dried ink lines and their constituent dyes using both 633 and 785 nm excitation. These allowed the chemical origins of the differences between the SERRS spectra of different inks to be determined. An initial investigation of pure samples of the 10 most common blue dyes showed that dyes with very similar chemical structures, such as Patent Blue V and Patent Blue VF (which differ only by a single OH group), gave SERRS spectra in which the only indications that the dye structure had changed were small differences in peak positions or in the relative intensities of the bands. SERRS studies of 13 gel pen inks were consistent with this observation. In some cases, inks from different types of pens could be distinguished even though each was dominated by a single dye, such as Victoria Blue B (Zebra Surari) or Victoria Blue BO (Pilot Acroball), because the predominant dye did not appear in other inks. Conversely, identical spectra were recorded from different types of pens (Pilot G7, Zebra Z-grip) because they all shared the same dominant Brilliant Blue G dye. Finally, some of the inks contained mixtures of dyes which could be separated by TLC and removed from the plate before being analysed with the same poly-SERS films. For example, the Pentel EnerGel ink was found to give TLC spots corresponding to Erioglaucine and Brilliant Blue G. Overall, this study has shown that the spectral differences between inks based on chemically similar but nonetheless distinct dyes are extremely small, so very close matches between SERRS spectra are required for confident identification. Poly-SERS substrates can routinely provide the very stringent reproducibility and sensitivity levels required. This, coupled with awareness of the reasons underlying the observed differences between similarly coloured inks, allows a more confident assessment of the evidential value of ink SERS and should underpin adoption of this approach as a routine method for the forensic examination of inks.