Tag: 上海东泰大厦


Analysis: Baines vote lowers the bar for the Hall

By Dave Sheinin | Washington Post

LAS VEGAS – Harold Baines was a fine baseball player who played for a very long time – a right fielder-turned-designated hitter who earned six all-star appearances, led the American League in slugging in 1984 and amassed 384 home runs and 2,866 hits over 22 seasons.

If you’re into advanced analytics, his wins above replacement (Baseball-Reference version) of 38.7 ranks tied for 545th all-time and puts him in the same neighborhood as contemporaries such as Paul …


Not Lamarck Again

Remember Lamarck? He was the pre-Darwin evolutionist whose theories we were all taught were overthrown by Darwin’s superior theory of natural selection. Lamarck’s theory of “inheritance of acquired characteristics” was shown to be demonstrably false by the dramatic experiments of Weismann, right? It was never really as clear-cut as that, as evolutionary historians know, but that has been the common understanding. This week, Nature printed an “Insight Perspectives” article about epigenetics (“above genetics”) that, while not referring to Lamarck by name, discussed “acquired” traits that could be inherited by “non-Mendelian” methods. Its author, Arturas Petronis,1 even spoke of the growing realization of the importance of epigenetics as a new “unifying principle” and a “paradigm shift” in the style of Thomas Kuhn.

Ever since the structure of DNA was elucidated, the “central dogma” of genetics has been that DNA is the master controller of inheritance. Information flows from DNA to proteins, and that dictates the phenotype (the outward form of the organism). In recent decades, the effects of environmental factors on the genome have become a growing area of research. Proteins are able to “tag” the histone proteins onto which genes are wound, affecting which genes are expressed or repressed. Some of these epigenetic tags can be inherited. Like most dogmas, the central dogma has been an impediment to new ways of scientific thinking, Petronis claims:

The nature-versus-nurture debate was one of the most important themes of biomedical science in the twentieth century. Researchers resolved it by conceding that both factors have a crucial role and that phenotypes result from the actions and interactions of both, which often change over time. Most ‘normal’ phenotypes and disease phenotypes show some degree of heritability, a finding that formed the basis for a series of molecular studies of genes and their DNA sequences. In parallel to such genetic strategies, thousands of epidemiological studies have been carried out to identify environmental factors that contribute to phenotypes. In this article, I consider complex, non-Mendelian, traits and diseases, and review the complexities of investigating their aetiology by using traditional – epidemiological and genetic – approaches. I then offer an epigenetic interpretation that cuts through several of the Gordian knots that are impeding progress in these aetiological studies.

It has been very difficult to assign cause-and-effect relationships from environmental factors to traits. “Even strong associations between an environmental factor and a disease do not necessarily prove that the environmental factor has caused the disease,” he said. It is even harder to link environmental factors to inherited traits, he continued. Even a term like heritability can be hard to nail down when talking specifics; multiple genes become involved, along with statistical likelihoods. Nevertheless, traits do become established in populations. For instance, an article on Live Science shows that Tibetans have inherited a hemoglobin trait that allows them to survive at high altitude.
Petronis calls for breaking the gene-centric paradigm: “I argue that taking an epigenetic perspective allows a different interpretation of the irregularities, complexities and controversies of traditional environmental and genetic studies.” He gave some examples of how acquired traits and environmental effects can influence epigenetic tags that are heritable. There is no longer a clear black-and-white distinction between the views of Darwin and Lamarck (neither of whom was mentioned in Petronis’s essay); the situation is now much more complex:

In the domain of epigenetics, the line between ‘inherited’ and ‘acquired’ is fuzzy. Stable epigenetic ‘nature’ merges fluidly with plastic epigenetic ‘nurture’. The ratio between inherited and acquired epigenetic influences can vary considerably depending on species, tissue, age, sex, environmental exposure and stochastic epigenetic events, all of which are consistent with empirical observations that heritability is dynamic and not static. Another close link between heritable factors and environmental factors in epigenetic regulation is the observation that exposure to certain environments has effects that, in some cases, are transmitted epigenetically for several generations.

In his conclusion, he said that this new perspective has all the trappings of what Thomas Kuhn called a paradigm shift: “handling the same bundle of data as before, but placing them in a new system of relations with one another by giving them a different framework.” It might explain things like sexual dimorphism, parental origin effects, remissions and relapses, intergenerational disease instances, decline of symptoms with age, and other phenomena – questions that an old paradigm would not find interesting, but a new one would. “The considerable theoretical and experimental potential of an epigenetic perspective makes it a strong alternative to the existing research into complex, non-Mendelian, genetics and biology,” he said. “Although the existence of competing theories may create some discomfort, it can also catalyse discoveries and is indicative of a mature scientific field.” Human genetics is not a closed book.

Oh, and what would this new paradigm mean for evolutionary theory? Glad you asked. Of all things, Petronis recalled an old quote by Hugo de Vries sometimes paraded with glee by creationists. But by recalling this quote, he left the reader hanging. In the new paradigm, what is the explanation for the arrival of the fittest?

All of the ideas that I have discussed here are highly relevant to the understanding of the fundamental principles of evolution. ‘Soft’, epigenetic, inheritance can have a key role in adaptation to environmental changes and can endure for more than a generation. Phenotypic plasticity might stem mainly from the ability of epigenetic genotype (or epigenotype) – rather than genotype – to produce different phenotypes in different environments. Heritable epigenetic variation could explain the faster-than-expected adaptation to environmental change that is often observed in natural populations. In addition, the large intra-individual epigenetic variation in the germ line may shed new light on the problem presented by one of the first geneticists, Hugo De Vries, more than a century ago, in his book Species and Varieties: Their Origin by Mutation, when he wrote “Natural selection may explain the survival of the fittest, but it cannot explain the arrival of the fittest.”

Petronis had nothing further to say about fitness or its arrival.
Furthermore, despite the title of his paper, “Epigenetics as a unifying principle in the aetiology of complex traits and diseases,” he gave no description of how any specific complex trait might arise by genetics, by epigenetics, or by any combination of the two. He only said that a new paradigm shift might “shed light” on the problem presented by Hugo de Vries a century ago.

1. Arturas Petronis, “Epigenetics as a unifying principle in the aetiology of complex traits and diseases,” Nature 465 (10 June 2010), pp. 721–727, doi:10.1038/nature09230.

That Nature would let in the ghost of Lamarck is a sign of their desperation with Darwin. So here we are a century after Hugo, waiting for some light. Petronis doesn’t have any. Hugo didn’t have any. Darwin didn’t have any. Lamarck didn’t have any. We’ve been sitting in the dark an awful long time listening to this crowd promise that some day somebody will “shed light on evolution.” Would you spare a dime for their paradigm? Don’t buy their promissory notes; not even your great-great-grandkids can expect to collect.


How Can Evolutionists Judge Morality?

When trying to account for the “evolution of religion and morality,” Darwinians cut off their own feet.

Today is the International Day of Prayer for the Persecuted Church. While many persecutors these days claim a religious motivation, it’s worthwhile to consider worldviews that have, in terms of sheer numbers, been responsible for the worst persecutions of Christians in history. It’s worthwhile, because those worldviews are still prevalent in 2017.

The Evolution of Religion, or Vice Versa

The latest pretentious academic trying to explain the evolution of religion fails, once again, to see the inherent illogic of his position. At The Conversation, Dimitris Xygalatas, Assistant Professor in Anthropology at the University of Connecticut, attempts to answer the question, “Are religious people more moral?” First, he admits that most people look askance at atheists:

Survey data show that Americans are less trusting of atheists than of any other social group. For most politicians, going to church is often the best way to garner votes, and coming out as an unbeliever could well be political suicide. After all, there are no open atheists in the U.S. Congress. The only known religiously unaffiliated representative describes herself as “none,” but still denies being an atheist.

So, where does such extreme prejudice come from? And what is the actual evidence on the relationship between religion and morality?

By merely asking the question “Are religious people more moral?” however, Xygalatas sets himself up as someone who can judge degrees (“more” vs “less”) of morality. He purports to know what “prejudice” is, having attributed it to people in his prior paragraph. Further, he assesses himself as a judge of evidence.

That would be fine if he accepted the Biblical view that humans are created in the image of God with a conscience that presupposes knowledge of universal standards of good and evil. But the article makes clear that Xygalatas believes religion evolved in social groups by natural selection. He speaks of the “co-evolution of God and society” – even making God a product of evolution! In the last paragraph, Xygalatas comes to the defense of the poor, forlorn, picked-on atheists:

In those societies [i.e., early less-evolved societies], a sincere belief in a punishing supernatural watcher was the best guarantee of moral behavior, providing a public signal of compliance with social norms.

Today we have other ways of policing morality, but this evolutionary heritage is still with us. Although statistics show that atheists commit fewer crimes than average, the widespread prejudice against them, as highlighted by our study, reflects intuitions that have been forged through centuries and might be hard to overcome.

From this we can assume Xygalatas is at least an atheist sympathizer, and probably an atheist himself. Religious people only have “intuitions,” but atheists (like himself and fellow evolutionists) have evidence, reason and knowledge. Again we see the kind of snooty elitism hatched in the academy (see Yoda Complex in the Darwin Dictionary).

Well, we can ask, is it immoral to be illogical? It is if you have a conscience, and a firm foundation for morals. On what foundation does Xygalatas stand? On natural selection? That’s a foundation of shifting sand where morality is based on breeding success. Exact opposite behaviors are justifiable in Darwinism if they produce offspring: fight or flight, fast or slow, camouflage or showmanship. This carries over into behaviors and thoughts as well.
Xygalatas can only claim moral success by Darwinian standards if, by writing this article, he passes on his genes to the next generation. The content of his thoughts is irrelevant. His judgments of degrees of morality are, therefore, vacuous. He’s blowing hot air in hopes of having more sex. That’s the “evolution of morality.”

By his own theory, he should get religion. It seems to work for the majority of people, who distrust atheists. They must be the fittest. By insinuating that atheists are more moral, Xygalatas has cut off his own feet and committed himself to the ranks of the unfit.

But if he decides to get religion to get with the fitness program, which religion? They’re all the same to a Darwinian. They’re all products of natural selection working on populations. He might as well pick one that gives him the most sexual pleasure, like Baal worshipers did in ancient times as they committed holy prostitution under sacred trees.

The only way he can justify his wish to judge other human beings with intellectual content, though, is to pick a worldview that has a foundation of truth and unchanging morality. There are not too many of those, since they require an all-powerful, all-knowing, righteous God who has Authority as Creator of the universe. The God of the Bible matches that description, but He does not take kindly to false faith. Xygalatas had better not try fakery if he decides to get fitness as a Christian. He has to really mean it. But to really mean it, he would have to abandon Darwinian evolution.

No matter which way you slice it, Xygalatas has propounded a view that collapses on itself. His intellectual sand castle degenerates into particles of sand in the sandpile on which it stands.

The Evolution of Revenge

The website The Conversation seems to give voice to a predominance of shallow-thinking academics. Another example is this article by Stephen Fineman of the University of Bath, titled “Wanting revenge is only natural – here’s why.” The basic idea is that revenge served an evolutionary purpose for our hominid ancestors, and that’s why we are stuck with it now.

As I explore in my new book, by sensationalising and deprecating the idea of revenge itself, we may forget that some forms of revenge can work well and serve a crucial purpose.

Revenge systems have been around for a very long time, with our primate cousins leading the way. Chimpanzees and macaques will freely inflict punishments on strangers and rule breakers and, with their excellent memories, cannily postpone retaliation until a suitable opportunity arises.

If that is true, then we are certainly doing the right thing (according to his worldview) to exact revenge on Fineman by attacking his views.* Fineman describes some pretty awful cases of violence and revenge, but says, “Who can blame them?” Did it occur to him that blame is a moral term? Did it occur to him that purpose is a term of design? By use of these words that are not found in the Darwin Dictionary, Fineman has revealed that he has a conscience. He knows deep down that cases of gratuitous violence are morally wrong.

*If Fineman were to exact revenge by responding to our views, we could just accuse him of not really meaning anything he says, because his worldview knows nothing of truth. His selfish genes are manipulating him to engage in behaviors (such as writing articles for The Conversation) in order to get more sex.
The Legacy of Atheist Morality

If Fineman or Xygalatas think that evolutionists have some kind of edge on morality due to the explanatory power of Darwinism, we would like them to take a serious look at some very, very disturbing photos. Last month was the 100th anniversary of “Red October,” the communist revolution led by Lenin that established the Soviet state. The UK Daily Mail has posted “Victims of the red revolution: The haunting faces of prisoners worked to death in Stalin’s slave camps emerge as 100th anniversary of 1917 Bolshevik takeover approaches.” Please take a moment to read the captions, and look into the faces of the prisoners. Imagine yourself among them, having been kidnapped in the middle of the night by secret police and carted off to some wretched place, most likely freezing cold, where you were sentenced to work for years, or to the death. The photo gallery is a mere peephole into a vast network of crimes against humanity justified in the name of atheistic communism.

The USSR was but one of the many atheistic regimes of the 20th century (China, Vietnam, Cambodia, Cuba and North Korea among the others). Whatever one wants to think idealistically about communist morality, here is what actually happened in the USSR:

This year marks 100 years since the 1917 Russian Revolution that led Lenin to take control of the Soviet Union.
When Lenin died in 1924, Joseph Stalin rose to power and became the Soviet Union’s authoritarian leader.
Between 1929 and Stalin’s death in 1953, 18 million people were transported to Soviet slave labour camps.
Labourers in the prisons worked up to 14 hours a day on huge projects, including the White Sea-Baltic Canal.
By the time the last Soviet gulag closed, millions of people had died from exhaustion, starvation and murder.

On WND, John Stossel reminds readers of “Communism’s bloody legacy: 100 years and 100 million deaths.” Communism was “one of the worst mistakes ever made,” he says, describing the truckloads of bodies and labor camps where the very ‘proletarians’ who had cheered the revolution were worked to death in extremely inhumane conditions. Soviet dictators prided themselves on their godlessness, murdering pastors and turning churches into museums of atheism. Bibles were absolutely forbidden. The only ‘scientific’ view of biology allowed in Soviet schools was—you guessed it—Darwinian evolution.

This week, on November 7-9, the Victims of Communism will hold a centennial commemoration for the Russian Revolution. It will not be a happy celebration or fun party. The objective will be “to honor the memory of the more than 100 million victims of communism, to celebrate liberty where it has triumphed, and to further our pursuit of a world free from communism.” One outspoken critic of communism, Dr Sebastian Gorka, whose parents escaped from communist Hungary, reminded Fox & Friends passionately that communism killed 100 million people. Responding to a recent poll that showed 44% of millennials think socialism and communism are good systems, Gorka said that the fault lies with our education system, subverted by leftists, which has denied the truth to our students.
And just as in Soviet Russia, any attempt in America to grant students an opportunity to hear criticisms of Darwinian theory in public school science classes is met with holy horror and condemnation.

Lest we forget, a wise Teacher once said, “By their fruits you shall know them” (Matthew 7:20). An evil tree brings forth evil fruit. That’s because good and evil are objective realities, not products of evolution.


How Long Does It Take To Build A Native Mobile App? [Infographic]

The last several years have seen an explosion in mobile applications. By the end of 2013, both Android’s Google Play and the Apple iOS App Store will be hosting a million apps – and we have only seen minor signs of slowing growth.

Where the heck are all these apps coming from? Thousands upon thousands of developers are working hard to pump out games, social networks, utility and productivity apps, news readers… if you can dream it, someone is building an app for it.

So, how much time and effort is going into feeding this beast? Exactly how long does it take to build a quality native mobile app (not a mobile Web, HTML5 app)? Boston-based Backend-as-a-Service (BaaS) mobile-cloud-platform vendor Kinvey set out to answer just that question.

More Than 4 Months!?

In a survey of 100 native mobile developers, Kinvey determined that creating a fully functional and polished app takes a team about 18 weeks from start to finish. That includes both front-end design and user interface as well as back-end integration like push notifications, user management and authentication, caching and sharing through social channels.

I know what many app developers are thinking when they hear that: “18 weeks?! Who the hell are these turtle-slow developers?” On the other hand, enterprise developers are probably saying: “18 weeks?! We are only halfway through by that point.”

Given the sheer volume of apps published on a monthly basis (the App Store averaged 641 new apps a day from September 2012 to January 2013), taking 4.5 months for one app does seem like a long development cycle. But as many smartphone users already know, not all of those apps (probably not even the vast majority of them) are any good.

Some apps are naturally easier to make than others, like reverse-engineered “copycat” apps or single-function apps like Android wallpaper apps. For instance, it was rumored that it took Facebook engineers only a matter of days to clone Snapchat with its similar Poke app.

Android vs. iOS: Which Takes Longer?

What if you are developing specifically for iOS or Android? Does one take longer than another? The answer used to be a definite yes: Android took longer because of the fragmentation issues of developing an app for a wide variety of smartphones.

That is not quite as true as it used to be, though. Google spent a good portion of 2012 updating and streamlining the Android Software Development Kit (SDK) to better handle varying screen sizes, pixel densities and operating system versions. Many improvements came to Android app development processes with both the Ice Cream Sandwich and Jelly Bean releases.

“Assuming equal skill level on the part of the developer, it shouldn’t take longer to build an app on one platform or the other,” said Joe Chernov, VP of marketing, on behalf of Kinvey’s engineering team. “In the fairly recent past, Android took longer because of the complexity of multiple device form factors. However Android’s vastly improved developer tools and SDK has removed that complexity. Now a developer can use a designer tool to instantly see what the UI will look like on multiple devices. Yet while building the app might take the same amount of time on each platform, what does take longer is the approval process. For Android, approval takes hours (and there’s a self-service option that removes approval entirely), while Apple can take weeks.”
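Chernov’s point about form-factor complexity is easier to picture with a concrete example. The sketch below is not from the article or from Kinvey; it is a minimal, hypothetical Android Activity in Java (the class name and layout resources are invented for illustration) showing the kind of screen-size handling the post-Ice Cream Sandwich tooling made routine: choose a phone or tablet layout from the device’s smallest screen width, and size UI elements in density-independent pixels rather than raw pixels.

```java
import android.app.Activity;
import android.content.res.Configuration;
import android.os.Bundle;
import android.util.DisplayMetrics;

// Hypothetical Activity illustrating screen-size handling; R.layout.main_phone
// and R.layout.main_tablet are invented layout resources, not from the article.
public class AdaptiveActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // smallestScreenWidthDp (API 13+) distinguishes phones from tablets
        // regardless of orientation; 600dp is the conventional tablet cutoff.
        Configuration config = getResources().getConfiguration();
        boolean isTablet = config.smallestScreenWidthDp >= 600;

        // Explicit selection shown for clarity; placing the tablet layout in
        // res/layout-sw600dp/ would let the framework pick it automatically.
        setContentView(isTablet ? R.layout.main_tablet : R.layout.main_phone);

        // Pixel densities vary widely across devices, so sizes are computed
        // from density-independent pixels (dp) rather than hard-coded pixels.
        DisplayMetrics metrics = getResources().getDisplayMetrics();
        int touchTargetPx = Math.round(48 * metrics.density); // 48dp minimum touch target
    }
}
```

The same idea, multiplied across layouts, assets and OS versions, is what the pre-2012 fragmentation complaints were about; the improved SDK and preview tooling reduced the hand-written branching a developer had to maintain.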
The infographic below is the result of Kinvey’s developer survey, conducted in partnership with AYTM (Ask Your Target Market). The infographic itself was created by Visual.ly.

Developers were asked 12 questions on how long it would take to perform a variety of functions, such as integrating server-side data storage or doing design work. Respondents came from different corners of Web and mobile development, with 30 identifying themselves as mobile Web (but not native) developers, 27 as strictly native developers and 43 as enterprise-level developers. The 18-week development-cycle conclusion was reached by adding up how long developers said it would take to perform certain tasks. The data seems to show a standard deviation of about two weeks in either direction.
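As a back-of-the-envelope illustration of that aggregation (not Kinvey’s actual methodology or numbers), the small Java sketch below sums hypothetical per-task estimates into a total cycle length. The task names and week figures are invented placeholders; only the add-it-all-up structure mirrors how the survey’s headline figure was described.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: task names and per-task week estimates are invented,
// not Kinvey's survey figures. The point is the aggregation itself:
// the headline cycle length is just the sum of the per-task averages.
public class DevCycleEstimate {
    public static void main(String[] args) {
        Map<String, Double> avgWeeksPerTask = new LinkedHashMap<>();
        avgWeeksPerTask.put("UI design", 4.0);
        avgWeeksPerTask.put("UI development", 5.0);
        avgWeeksPerTask.put("Back-end integration (push, auth, caching, sharing)", 6.0);
        avgWeeksPerTask.put("Testing and release", 3.0);

        double totalWeeks = 0.0;
        for (Map.Entry<String, Double> task : avgWeeksPerTask.entrySet()) {
            System.out.printf("%-55s %.1f weeks%n", task.getKey(), task.getValue());
            totalWeeks += task.getValue();
        }

        // The survey's "about two weeks in either direction" spread would come
        // from the distribution of individual respondents' totals, not modelled here.
        System.out.printf("Estimated development cycle: %.1f weeks%n", totalWeeks);
    }
}
```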

So, how long will it take you to develop your native mobile app, start to finish? Really, there is no easy answer to that question. Dave Bisceglia, founder and CEO of Boston-based iOS game development studio The Tap Lab, sums it up nicely:

“The less exciting but entirely true answer is, ‘It depends,’” Bisceglia said. “I’ve seen very talented teams crank out high-quality apps in just a few weeks. However, the demand for higher production quality in apps has certainly risen in recent years. Accordingly, app dev cycles have extended and we’re seeing folks spend anywhere from 6 to 12 months on more complex projects.”

Note: Click the infographic to see the full-size version.

Top image courtesy of Shutterstock.