Thursday, February 12, 2015
Want to change from Windows to Mac? Read this

Six myths about Macs in the enterprise
By Tom Kaneshige
Want a Mac for work? Sure you do. Macs are powerful, sleek and super easy to use. Even your company's top executives probably have them.
But chances are you don't have a Mac. "Our stats do not show Apple's major uptake in the enterprise market," says Gartner (IT) analyst Mikako Kitagawa. "Apple's share in the PC market has been less than 1 percent in the last several years and has not changed."
Companies and IT departments have made all sorts of claims about why Macs shouldn't be allowed to enter the enterprise, especially not en masse. Some of their claims make valid points. Others are more myth than reality. The so-called barriers to Macs in the enterprise range from the cost of Macs to ill-prepared IT staff to the lack of user justification.
Here are six concerns that hinder Mac adoption:
Do Macs Cost Too Much?
A recent CIO story, "Are Macs Really Cheaper to Manage Than PCs?", sparked a heated debate among readers who promptly took sides. Indeed, a Mac's price tag is the highest hurdle Macs need to clear for enterprise adoption.
Many CIOs claim that lower support costs offset the premium price for Macs. In fact, Tom Kelly, who wears two hats - CFO and CIO - at Healthcare IP Partners, brought Macs into a Windows-only enterprise a couple of years ago because he saw the potential for Macs to relieve desktop-support management headaches and cut support costs.
Not only do Mac users experience fewer problems, Kelly says, they also take ownership by either troubleshooting technical hiccups themselves or taking their Mac to an Apple Store.
An Enterprise Desktop Alliance survey found that Macs were cheaper in six of seven computer management categories, including troubleshooting, help desk calls, system configuration, user training and supporting infrastructure (servers, networks and printers).
Mac naysayers, on the other hand, cite the high cost of Macs coupled with the overhead of having to support two operating systems. One reader writes: "User support cost-savings are eaten up by transition costs: backup, systems management, antivirus, office software, rights management, Excel/Word/PPT macros. All that stuff needs to be changed or implemented redundantly."
The truth lies somewhere in the middle.
Robert Pickering, vice president of information technology at AAA Allied Group, says the upfront cost of a Mac is significant. His standard-issue Hewlett-Packard laptop is around $1,000, whereas a MacBook Pro starts at $2,500 plus additional costly peripherals such as a docking station.
This cost difference means employees must make a compelling case to managers for a Mac--which isn't easy. "People often only look at what's coming out of their initial capital expenditure budget," says Pickering, a self-proclaimed Mac fan since 1984. "They're not looking at depreciation or residual value because those are three or four years away."
Yet Macs make up the cost difference during those years, Pickering says. AAA Allied Group has a PC refresh cycle every three years, and a three-year-old Hewlett-Packard laptop is basically worthless. Dim LCD. Crashed drives. Cracked casings.
A three-year-old MacBook Pro, on the other hand, can be sold on eBay or privately to the employee for $1,000. Or a MacBook Pro can be used for another year. "Now we're in the same ballpark on the hardware costs," Pickering says.
Will Virtualisation Gobble Up Savings?
On the software side, Pickering says he saves licensing dollars on Macs because he doesn't buy anti-virus and anti-spyware software for them. For Windows PCs, though, that software is a must-have.
Mac-related support issues are also nearly non-existent. "I would like a larger percentage of Macs in the environment because users would be happier, as would my help desk because they wouldn't get the calls," Pickering says. (AAA Allied Group began supporting Macs beyond the marketing department in 2009, and the number of Macs has grown to 8 percent of some 1,000 computers.)
The problem is that Macs often need desktop virtualisation in order to run critical Windows apps, namely Office and Outlook--and this upends much of the Mac savings.
Another reader writes: "Almost all the Macs in my company require VMware Fusion/Parallels or WinXP with Bootcamp, which means time spent configuring and supporting the PC side of the setup, as well as constant hacks and work-arounds to get features that are a simple setup on the PC to work on a Mac. Add to that no centralised administration with Active Directory, problematic setups with network shares, email quirks and the like, and I would have to say I completely disagree that Macs are cheaper than PCs."
It doesn't make sense to give a Mac to an employee when most of the apps will be running on a virtual machine. "That's a crutch," Pickering says. "It's difficult to justify the Mac because you can't save on the licensing. It gets expensive running Windows in virtualisation on top of something else."
Pickering, though, predicts this problem will be short-lived. Employees, he says, often convert to native Mac apps after a couple of months, with the exception of Outlook. Mac users don't want to deal with the quirks that come with Entourage, so the last virtualised Windows app is Outlook.
"But the advent of Office 2010, including native Outlook on the Mac, will be game changing," Pickering says. "You wont need desktop virtualisation anymore."
Do You Really Need a Mac?
One of the most common responses to Mac requests is, "Why do you need one?" It's a looming hurdle that discourages many employees from even asking for a Mac.
Some employees really do need Macs to get their job done. Graphics departments need Macs because critical apps such as Adobe Creative Suite simply don't run well on Windows. Web developers need Macs to test code on a variety of browsers; you can't run the Mac versions of Safari or Firefox on a Windows machine because the Mac OS can't be virtualised, at least not legally.
At AAA Allied Group, top executives have Macs: the vice president of marketing, vice president of membership, executive vice president of travel. The latter is on the road all the time and carries a MacBook Air for its convenience and computing power. Pickering got a Mac as a condition of his employment. "Execs own budgets, so they can self-approve," he says.
What about a Mac for the rest of us? Pickering says executives with Macs can grease the wheels for employees to get Macs. That's because they appreciate the Mac's impact on productivity and are more likely to approve them. Managers with PCs, on the other hand, make Macs a hard sell for employees reporting to them.
Companies competing for talent can also dangle Macs as an incentive. A Silicon Valley law firm brought Macs into the enterprise two years ago because many lawyers wanted a choice of PC. Today, half of the lawyers use a Mac. "There's buzz among attorneys that if you work for us, you get to use a Mac," says the CIO, speaking on condition of anonymity.
Pickering says employees can use the refresh cycle to help justify a Mac. "If you're willing to extend your refresh by a year, then you can have a Mac," he says. "We'll get the payback on hardware costs."
Can IT Support the Mac?
Another Mac barrier to entry is an unprepared IT staff. When Pickering decided to support Macs in the enterprise, he first needed to find someone on his 20-person IT staff willing to get up to speed on the Mac. A network admin in Connecticut took up the challenge.
Pickering gave the admin a Mac. In return, the admin promised to learn as much as he could about the Mac, bring Macs into Active Directory, and take all Mac-related support calls. Pickering would back him up as the go-to Mac guy.
Today, Pickering's help desk staff has picked up Mac lessons and can provide some support. As the number of Macs continues to grow, he's looking to add another Mac specialist to augment the frontline support team, perhaps someone from within the team. "I've got no end of people raising their hands and asking for Macs inside IT," Pickering says.
Learning the tricks of another OS isn't easy. For instance, a systems admin and Mac tech for a 40-employee company, speaking on condition of anonymity, says moving from a Mac-only environment to a mixed one required a lot of reading.
"The burden of two operating systems is mostly the sheer span of knowledge involved and the time available to study or play with them," he says. "Right now, I've got Mac OS X 10.6 and Windows 7 running on two machines, and two manuals over 800 pages each for me to get at least acquainted with. Then there's the differences between Office 2007 for Win and Office 2008 for Mac, and so on."
Are Mac Apps Enterprise Ready?
Like IT workers, Mac apps face a learning curve, too.
Consider the systems admin, who says his company's growth spurt five years ago necessitated a move to Windows. "We needed to move up to enterprise-scale email," he says. "Macs at the time had nothing seriously well regarded for the enterprise--DNS, Exchange, Active Directory."
It's very difficult to run a Mac-only environment, Pickering agrees, because compatibility issues keep cropping up. "What is your email platform? Group calendaring? Group scheduling?" he asks.
Meanwhile, Windows desktop-management software vendors may offer a Mac version, but many of those versions don't work well, Mac engineers say. Getting good enterprise-class support for Mac features from Windows developers can be problematic at times, too, they say.
Some apps just flat-out don't work well on the Mac. In one of five little-known surprises about Macs, Healthcare IP Partners' Kelly relates a story about a bad Mac app. He had been using GoToMeeting, a Web conferencing tool, when rival Cisco WebEx came out with a great deal. So Kelly switched to WebEx--and it regularly hung up on the Mac when hosting a conference.
Avi Learner, an Apple certified consultant, has had similar experiences. "Cisco products are notoriously hostile towards Macs, even the VPN dial-up tool," Learner says. "I've never heard why, but I experience it in the field all the time."
To be fair, the anonymous systems admin says he's been using Cisco VPN on three Macs for about three years with excellent performance. CIOs including Pickering also say that many Windows apps run better on Macs in a virtual environment than they do on a PC.
Will a Mac Open the Floodgates?
When Pickering asked for a Mac as a condition of his employment four years ago, he recalls, the CFO agreed with a caveat: "You can't convert the whole environment to Macs."
That's a fear many executives share. If my co-worker has a Mac, the thinking goes, why can't I have one? Pickering, though, isn't concerned that Macs will one day trump Windows in the enterprise. "By and large, my end users really don't care what they're using," he says.
http://www.computerworlduk.com/technology/operating-systems/mac-os/how-to/index.cfm?articleid=3223
Six programming paradigms that will change how you think about coding
This is not your grandma's "functional programming will change the world!" blog post: this list is much more esoteric. I'd wager most readers haven't heard of the majority of the languages and paradigms below, so I hope you have as much fun learning about these new concepts as I did.
Note: I have only minimal experience with most of the languages below: I find the ideas behind them fascinating, but claim no expertise in them, so please point out any corrections and errors. Also, if you've found any new paradigms and ideas not covered here, please share them!
Update: this post hit the front page of r/programming and HN. Thank you for the great feedback! I've added some corrections below.
Concurrent by default

Let's kick things off with a real mind-bender: there are programming languages out there that are concurrent by default. That is, every line of code is executed in parallel!
For example, imagine you wrote three lines of code, A, B, and C:
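(Schematically, with each placeholder statement on its own line:)

```
A
B
C
```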
In most programming languages, A would execute first, then B, and then C. In a language like ANI, A, B, and C would all execute at the same time!
Control flow, or ordering between lines of code, in ANI is merely a side effect of the data dependencies between those lines. For example, if B had a reference to a variable defined in A, then A and C would execute at the same time, and B would execute only after A finished.
Let's look at an example in ANI. As described in the tutorial, ANI programs consist of "pipes" and "latches" that are used to manipulate streams and data flows. The unusual syntax is tough to parse, and the language seems dead, but the concepts are pretty interesting.
Here's a "Hello World" example in ANI:
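(The snippet below is my reconstruction of the tutorial's hello-world program; since the ANI project is no longer maintained, treat the exact syntax as approximate.)

```
"Hello, World!" ->std.out
```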
In ANI terminology, we are sending the `"Hello, World!"` object (a string) to the `std.out` stream. What happens if we send another string to `std.out`?
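(Again a reconstruction; the second string is an arbitrary example of my own.)

```
"Hello, World!" ->std.out
"Goodbye, World!" ->std.out
```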
Both of these lines of code execute in parallel, so they could end up in any order in the console. Now, look what happens when we introduce a variable on one line and reference it later:
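(My best reconstruction of the tutorial's latch example; I could not verify the exact syntax against a working ANI toolchain, so read it as a sketch.)

```
s = [string\];
"Hello, World!" ->s;
\s ->std.out;
```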
The first line declares a "latch" (latches are a bit like variables) called `s` that contains a string; the second line sends the text "Hello, World!" to `s`; the third line "unlatches" `s` and sends the contents to `std.out`. Here, you can see ANI's implicit program sequencing: since each line depends on the previous one, this code will execute in the order it is written.

The Plaid language also claims to support concurrency by default, but uses a permissions model, as described in this paper, to set up control flow. Plaid also explores other interesting concepts, such as Typestate-Oriented Programming, where state changes become a first-class citizen of the language: you define objects not as classes, but as a series of states and transitions that can be checked by the compiler. This seems like an interesting take on exposing time as a first-class language construct, as discussed in Rich Hickey's "Are We There Yet" talk.
Multicore is on the rise and concurrency is still harder than it should be in most languages. ANI and Plaid offer a fresh take on this problem that could lead to amazing performance gains; the question is whether "parallel by default" makes concurrency easier or harder to manage.
Update: the description above captures the basic essence of ANI and Plaid, but I used the terms "concurrent" and "parallel" interchangeably, even though they have different meanings. See Concurrency Is Not Parallelism for more info.
Dependent types

Example languages: Idris, Agda, Coq
You're probably used to type systems in languages like C and Java, where the compiler can check that a variable is an integer, list, or string. But what if your compiler could check that a variable is "a positive integer", "a list of length 2", or "a string that is a palindrome"?
This is the idea behind languages that support dependent types: you can specify types that check the value of your variables at compile time. The shapeless library adds partial, experimental support (read: probably not ready for primetime) for dependent types to Scala and offers an easy way to see some examples.
Here is how you can declare a `Vector` that contains the values 1, 2, 3 with the shapeless library:
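(The original example relied on a demo `Vector` type bundled with shapeless, which isn't reproduced here; the sketch below uses shapeless's `Sized` type, which also tracks a collection's length in its type, to show the same idea. I've kept the variable name `l1` so the discussion that follows still applies, but treat the exact types as illustrative.)

```scala
import shapeless._

// A collection of Ints whose length (3) is part of its type:
// the inferred type is roughly Sized[IndexedSeq[Int], nat._3].
val l1 = Sized(1, 2, 3)
```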
This creates a variable `l1` whose type signature specifies not only that it's a `Vector` that contains `Int`s, but also that it is a `Vector` of length 3. The compiler can use this information to catch errors. Let's use the `vAdd` method in `Vector` to perform a pairwise addition between two `Vector`s:
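(shapeless's `Sized` has no `vAdd` method, so the helper below is a hypothetical stand-in: it only accepts two collections whose type-level lengths match, which is what gives us the compile-time check discussed next.)

```scala
import shapeless._

// Hypothetical pairwise addition: both arguments must share the same
// type-level length N, so mismatched lengths are rejected by the compiler.
def vAdd[N <: Nat](a: Sized[IndexedSeq[Int], N],
                   b: Sized[IndexedSeq[Int], N]): IndexedSeq[Int] =
  a.unsized.zip(b.unsized).map { case (x, y) => x + y }

val l2 = Sized(10, 20, 30)
vAdd(l1, l2)               // IndexedSeq(11, 22, 33)
// vAdd(l1, Sized(1, 2))   // does not compile: lengths _3 and _2 differ
```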
The example above works fine because the type system knows both `Vector`s have length 3. However, if we tried to `vAdd` two `Vector`s of different lengths, we'd get an error at compile time instead of having to wait until run time!

Shapeless is an amazing library, but from what I've seen, it's still a bit rough, only supports a subset of dependent typing, and leads to fairly verbose code and type signatures. Idris, on the other hand, makes types a first-class member of the programming language, so the dependent type system seems much more powerful and clean. For a comparison, check out the "Scala vs Idris: Dependent Types, Now and in the Future" talk.
Formal verification methods have been around for a long time, but were often too cumbersome to be usable for general-purpose programming. Dependent types in languages like Idris, and perhaps even Scala in the future, may offer lighter-weight and more practical alternatives that still dramatically increase the power of the type system in catching errors. Of course, no dependent type system can catch all errors, due to inherent limitations stemming from the halting problem, but if done well, dependent types may be the next big leap for static type systems.
Concatenative languages
Ever wonder what it would be like to program without variables and function application? No? Me neither. But apparently some folks did, and they came up with concatenative programming. The idea is that everything in the language is a function that pushes data onto a stack or pops data off the stack; programs are built up almost exclusively through function composition (concatenation is composition).
This sounds pretty abstract, so let's look at a simple example in cat:
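(The original snippet isn't reproduced here; since the walkthrough below says the output is 5, I'm assuming the two numbers are 2 and 3.)

```
2 3 +
```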
Here, we push two numbers onto the stack and then call the `+` function, which pops both numbers off the stack and pushes the result of adding them back onto the stack: the output of the code is 5. Here's a slightly more interesting example:
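(This is a reconstruction of the program the walkthrough below describes; cat's exact `define` syntax may differ slightly.)

```
define foo {
  10 <
  [0]
  [42]
  if
}

20 foo
```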
Let's walk through this line by line:

- First, we declare a function `foo`. Note that functions in cat specify no input parameters: all parameters are implicitly read from the stack. `foo` calls the `<` function, which pops the first item off the stack, compares it to 10, and pushes either `True` or `False` back onto the stack.
- Next, we push the values 0 and 42 onto the stack: we wrap them in brackets to ensure they get pushed onto the stack unevaluated. This is because they will be used as the "then" and "else" branches (respectively) for the call to the `if` function on the next line.
- The `if` function pops 3 items off the stack: the boolean condition, the "then" branch, and the "else" branch. Depending on the value of the boolean condition, it'll push the result of either the "then" or the "else" branch back onto the stack.
- Finally, we push 20 onto the stack and call the `foo` function.
- When all is said and done, we'll end up with the number 42.
This style of programming has some interesting properties: programs can be split and concatenated in countless ways to create new programs; the syntax is remarkably minimal (even more minimal than LISP), which leads to very concise programs; and there is strong metaprogramming support. I found concatenative programming to be an eye-opening thought experiment, but I'm not sold on its practicality. It seems like you have to remember or imagine the current state of the stack instead of being able to read it from the variable names in the code, which can make it hard to reason about the code.
Declarative programming
Declarative programming has been around for many years, but most programmers are still unaware of it as a concept. Here's the gist: in most mainstream languages, you describe how to solve a particular problem; in declarative languages, you merely describe the result you want, and the language itself figures out how to get there.
For example, if you're writing a sorting algorithm from scratch in C, you might write the instructions for merge sort, which describes, step by step, how to recursively split the data set in half and merge it back together in sorted order: here's an example. If you were sorting numbers in a declarative language like Prolog, you'd instead describe the output you want: "I want the same list of values, but each item at index `i` should be less than or equal to the item at index `i + 1`". Compare the previous C solution to this Prolog code:
If you've used SQL, you've done a form of declarative programming and may not have realized it: when you issue a query like `select X from Y where Z`, you are describing the data set you'd like to get back; it's the database engine that actually figures out how to execute the query. You can use the `explain` command in most databases to see the execution plan and figure out what happened under the hood.

The beauty of declarative languages is that they allow you to work at a much higher level of abstraction: your job is just to describe the specification for the output you want. For example, the code for a simple sudoku solver in Prolog just lists out what each row, column, and 3x3 square of a solved sudoku puzzle should look like:
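(A sketch using SWI-Prolog's CLP(FD) library, standing in for the solver the original post pointed to; the predicates `ins`, `all_distinct` and `transpose` are part of that library.)

```prolog
:- use_module(library(clpfd)).

% A solved grid is nine rows of nine digits 1..9, where every row,
% every column, and every 3x3 square contains distinct values.
sudoku(Rows) :-
    length(Rows, 9),
    maplist(same_length(Rows), Rows),
    append(Rows, Cells),
    Cells ins 1..9,
    maplist(all_distinct, Rows),
    transpose(Rows, Columns),
    maplist(all_distinct, Columns),
    Rows = [R1, R2, R3, R4, R5, R6, R7, R8, R9],
    squares(R1, R2, R3),
    squares(R4, R5, R6),
    squares(R7, R8, R9).

% Take three rows and constrain each of their 3x3 squares, left to right.
squares([], [], []).
squares([A, B, C | T1], [D, E, F | T2], [G, H, I | T3]) :-
    all_distinct([A, B, C, D, E, F, G, H, I]),
    squares(T1, T2, T3).
```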
Here is how you would run the sudoku solver above:
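(Here I'm using the well-known example puzzle; `label/1` fills in the remaining cells and `portray_clause/1` just prints each row.)

```prolog
?- Rows = [[5,3,_, _,7,_, _,_,_],
           [6,_,_, 1,9,5, _,_,_],
           [_,9,8, _,_,_, _,6,_],
           [8,_,_, _,6,_, _,_,3],
           [4,_,_, 8,_,3, _,_,1],
           [7,_,_, _,2,_, _,_,6],
           [_,6,_, _,_,_, 2,8,_],
           [_,_,_, 4,1,9, _,_,5],
           [_,_,_, _,8,_, _,7,9]],
   sudoku(Rows),
   maplist(label, Rows),
   maplist(portray_clause, Rows).
```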
The downside, unfortunately, is that declarative programming languages can easily hit performance bottlenecks. The naive sorting algorithm above is likely `O(n!)`; the sudoku solver above does a brute force search; and most developers have had to provide database hints and extra indices to avoid expensive and inefficient plans when executing SQL queries.

Symbolic programming

Example languages: Aurora
The Aurora language is an example of symbolic programming: the "code" you write in these languages can include not only plain text, but also images, math equations, graphs, charts, and more. This allows you to manipulate and describe a large variety of data in the format native to that data, instead of describing it all in text. Aurora is also completely interactive, showing you the results from each line of code instantly, like a REPL on steroids.
The Aurora language was created by Chris Granger, who also built the Light Table IDE. Chris outlines the motivation for Aurora in his post Toward a better programming: some of the goals are to make programming more observable and direct, and to reduce incidental complexity. For more info, be sure to see Bret Victor's incredible talks: Inventing on Principle, Media for Thinking the Unthinkable, and Learnable Programming.
Update: "symbolic programming" is probably not the right term to use for Aurora. See the Symbolic programming wiki for more info.
Knowledge-based programming

Examples: Wolfram Language
Much like the Aurora language mentioned above, the Wolfram Language is also based on symbolic programming. However, the symbolic layer is merely a way to provide a consistent interface to the core of the Wolfram Language, which is knowledge-based programming: built into the language is a vast array of libraries, algorithms, and data. This makes it easy to do everything from graphing your Facebook connections to manipulating images, looking up the weather, processing natural language queries, plotting directions on a map, solving mathematical equations, and much more.
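(A few one-liners to make that concrete. These are real Wolfram Language functions, though the exact arguments are illustrative and the data-backed calls need an internet connection.)

```mathematica
WeatherData["Boston", "Temperature"]         (* current temperature from curated weather data *)
Solve[x^2 + 3 x + 2 == 0, x]                 (* symbolic equation solving, no library needed *)
WolframAlpha["distance from Earth to Mars"]  (* free-form natural language query *)
```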
I suspect the Wolfram Language has the largest "standard library" and data set of any language in existence. I'm also excited by the idea that Internet connectivity is an inherent part of writing the code: it's almost like an IDE where the auto-complete function does a Google search. It'll be very interesting to see if the symbolic programming model is as flexible as Wolfram claims and can truly take advantage of all of this data.
Update: although Wolfram claims the Wolfram Language supports "symbolic programming" and "knowledge programming", these terms have slightly different definitions. See the Knowledge level and Symbolic Programming wikis for more info.