Friday, December 16th, 2005

Business Logic: Server or Browser?

Category: Programming

Eric Pascarello says keep business logic in the server for speed and security.


(A)nything that is rendered in a browser is basically “Open Source”, since there is no way to really protect it. All of this information is “downloaded” into the cache, so everyone has a copy of it. You can destroy all the formatting by eliminating returns and changing variable names, but it can still be deciphered by anyone with time. So basically, if security is very important to you, then your business logic has to be on the server.


Get a person with 10 browser windows open, Word and Excel, Acrobat Reader, Outlook, Media Player, and those spyware apps running, and say bye to the memory and speed … Plus, most servers are built to crunch data. Why use a browser running JavaScript, which has poor memory management?

Is this always the case? Or are there some situations that warrant smarter clients?

Posted by Michael Mahemoff at 2:04 pm




If your application is public facing you should never trust the client.

Assume that every request may contain data intended to cause damage to the system. The server should define precisely what the data is and exactly how it should look; anything not matching those rules should be ignored completely.
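That rule can be sketched as a server-side whitelist: define a pattern for every accepted field and drop everything else. This is a minimal sketch; the field names and patterns (`username`, `quantity`) are hypothetical examples, not from any real application.

```javascript
// Whitelist validation: only fields the server explicitly defines survive,
// and each must match its rule exactly. Field names/rules are hypothetical.
const rules = {
  username: /^[a-z0-9_]{3,16}$/i,
  quantity: /^[0-9]{1,4}$/,
};

function validate(request) {
  const clean = {};
  for (const field of Object.keys(rules)) {
    const value = request[field];
    // Anything missing or not matching its rule rejects the whole request.
    if (typeof value !== "string" || !rules[field].test(value)) {
      return null;
    }
    clean[field] = value;
  }
  return clean; // unknown extra fields are silently dropped, never copied
}
```

Note that the rejected request is discarded outright rather than "cleaned up", which matches the comment's advice to never trust the client.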

Putting business logic on the side of the client just because you can is a bad idea.

Comment by Dan Kubb — December 16, 2005

This seems like common sense. You should try to follow the model-view-controller architecture, with your view in the browser using HTML, CSS, JavaScript, etc., and most of the controller and all of the model on the server.

Comment by Tom Robinson — December 16, 2005

For some kinds of AJAX applications having to have the client go back to the server for every controller and model update can get very slow. For example, imagine if the Controller and View portions of the new Yahoo Mail client were on the server side; every time you pressed a button, it had to tell the server that the click occurred, causing a new view to be generated which gets dynamically inserted into the user interface (this is how Ruby on Rails works by default, by the way). This could really impact performance.
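The alternative Brad is hinting at can be sketched as a controller and view that both run client-side, so a click re-renders locally without a round trip. All names here (`model`, `renderList`, `onSelect`) are illustrative, not from any real Yahoo Mail code.

```javascript
// Hypothetical client-side MVC sketch: clicking a message marks it read and
// re-renders the list locally; no server request is needed for the UI update.
const model = {
  messages: [
    { id: 1, subject: "Hello", read: false },
    { id: 2, subject: "Re: Ajax", read: true },
  ],
};

// View: a pure function from model to an HTML string.
function renderList(model) {
  return model.messages
    .map(m => `<li class="${m.read ? "read" : "unread"}">${m.subject}</li>`)
    .join("");
}

// Controller: handles the click entirely client-side. A background Ajax call
// to persist the change (omitted here) could happen asynchronously.
function onSelect(model, id) {
  const msg = model.messages.find(m => m.id === id);
  if (msg) msg.read = true;
  return renderList(model);
}
```

The server is only consulted to persist state, not to regenerate the view, which is exactly the round trip the comment says a server-side controller would force on every click.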

Comment by Brad Neuberg — December 16, 2005

The subjects Eric Pascarello is discussing are valid, even though I find the article insufficiently defined and too quick on the assumptions.

I think in this matter, one should separate POST from GET methods. For the GET method, I’ll use an analogy I have used many times:
Consider the server as a dealer at a blackjack table and the clients/visitors/browsers as players. Each time a card has to be dealt, the dealer has to get a template card, paint it with the right picture, and then deal it to each player. There is a faster alternative to this process:

The dealer gives the player the template (XSL) and the picture (XML), and the player can paint it himself. Now, with the help of Ajax, the client can obtain new data from the dealer and parse it with the XSL obtained earlier. Do you see the gain in speed? The work is distributed, and the dealer can handle many more players. Another positive effect is that there is far less network traffic than if a large, fully rendered HTML file were transported. Think about it… or even make the calculations, if you don’t believe me.
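The dealer analogy can be sketched in a few lines. In a real browser the "template card" would be an XSL stylesheet applied with `XSLTProcessor`; to keep this sketch self-contained, a tiny string-template function stands in for the stylesheet, and the template and data are hypothetical.

```javascript
// The "template card" is downloaded once (standing in for the XSL file)...
const cardTemplate = "<div class='card'>{rank} of {suit}</div>";

// ...and each subsequent Ajax "deal" only carries the data (the XML/picture),
// which the client paints into the template itself.
function paint(template, data) {
  return template.replace(/\{(\w+)\}/g, (_, key) => data[key]);
}

// Each deal is a small data payload, not a full server-rendered HTML page:
const card = paint(cardTemplate, { rank: "queen", suit: "hearts" });
```

The bandwidth gain the comment describes comes from the second step: after the first request, only `{rank, suit}`-sized payloads cross the network instead of rendered markup.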

When it comes to POSTing, or sending data to update the database, I agree that there is a security issue, but no more than in a “classic postback web application” (consider the CAPTCHA functionality for forms). The security issues are still the same (if programmed smartly), especially if the web application is public, without login.

In the article, Eric writes:
-“Since security is an easier topic to cover I will start there.”
This is an annoying and badly formulated sentence. Security is not an easier topic; it is the hardest one.

Last but not least: 1 server + 1000 clients can render/parse data faster than 1 server serving 1000 clients, any day. In my opinion it is better that 50 clients feel slowness (for local reasons) than that 1000 clients are slowed down because the poor dealer has been charged with all the duties (global reasons).

The old cliché: “the world is not black or white, there are gray nuances.” It is wrong to say that the logic must reside server-side or client-side. Business logic can be distributed, if it isn’t critical. The classical developer will speak of the benefits of server-side logic, but their solutions perform poorly (which they don’t speak loudly about, or remember in the next project; they just put more hardware into the server).

Comment by Hakan Bilgin — December 16, 2005

I would like to complete my previous comment with some things to think about. I don’t think that Eric Pascarello really thought through what he is saying “major no no” to. I have seen articles on different forums about experiments with distributed logic, so I don’t think I am alone in testing this approach.

Semi-distributed business logic has given me:
– A single server-side file (~8 KB); all Ajax calls are made against this file
– Faster response from the server
– Agility: the ability to switch between different server configurations (LAMP, WAMP, JSP-Oracle, ASP-MS SQL); since it’s a single file, it can easily be rewritten for other languages
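The "single server-side file" idea can be sketched as one dispatcher that routes an action name to a handler, which is all that would need rewriting when porting between server stacks. The handler names and request shape here are hypothetical, not from the commenter's actual file.

```javascript
// One entry point for every Ajax call: { action, params } in, JSON out.
// Handler names (listOrders, saveOrder) are illustrative placeholders.
const handlers = {
  listOrders: params => ({ orders: [], page: params.page || 1 }),
  saveOrder: params => ({ saved: true, id: params.id }),
};

function dispatch(request) {
  const handler = handlers[request.action];
  if (!handler) {
    return { error: "unknown action" }; // whitelist: unknown actions refused
  }
  return handler(request.params || {});
}
```

Because the routing table is the whole server-side surface area, porting to another stack means reimplementing only this one file, which is the "agility" point in the list above.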

But also keep in mind that this approach isn’t yet suitable in all cases. My point is just this: you shouldn’t oppose something that you haven’t fully tested.

Comment by Hakan Bilgin — December 19, 2005

At the end of the day, speed, security, flexibility and scalability have to be the most fundamental factors a techie must address when building any application, particularly online. I have been involved in online development since the birth of the industry and have seen many fads come and go, as well as many ways of programming them.

In my opinion AJAX is to techies what FLASH has been for designers – a clever tool that can get out of control VERY quickly to the point of unusability. The term “Design Masturbation” has been applied many times to flash and I think “Technology Masturbation” will be applied liberally as well in the coming year or so when everyone seems to be doing it…

AJAX proponents typically seek to break down the MVC paradigm, which in itself isn’t a bad thing; where it becomes bad is in not presenting a replacement that works properly in a large number of cases.

The Model-View-Controller paradigm is a cornerstone of computing science and has been shown to be a method of addressing the points I started off with: speed, security, scalability and flexibility. Code too much of your business logic into the client side and you will lose out on many of these things.

Likewise there is the opposite extreme, whereby too much work can be done on the server end, thus burdening your server. If you have a bad algorithm it makes things even worse, but this is where we go back to solid, dependable computer science solutions and procedures for writing or choosing an algorithm: O(N), O(log N), etc. As a techie I see too many people within the new media industry very keen to “chuck out” the learnings we have had from over 60 years of computer science research into exactly these areas.
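The O(N) vs. O(log N) point can be made concrete with the classic example: finding a value in a sorted array by linear scan versus binary search. This is a generic textbook sketch, not code from the article.

```javascript
// Linear scan: O(N) comparisons in the worst case.
function linearFind(sorted, target) {
  for (let i = 0; i < sorted.length; i++) {
    if (sorted[i] === target) return i;
  }
  return -1;
}

// Binary search on the same sorted array: O(log N) comparisons.
function binaryFind(sorted, target) {
  let lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;          // midpoint of the remaining range
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;
}
```

On a server handling thousands of clients, choosing the O(log N) version is exactly the kind of "solid, dependable" decision the comment argues matters more than where the code runs.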

At the end of the day the client / server argument is an old one and only the individual project can dictate the right balance. As a commercial developer though I leave with a parting comment:

My clients primarily sell products, whether physical or virtual [information]. They want web applications that are:
– fast to download [hence, potentially, a call for AJAX-style non-page-refreshes]
– secure [which can be achieved, AJAX or no]
– reliable [simple code = fewer problems maintaining]
– cost effective [more fiddly code = more up-front costs and increased maintenance costs]
– flexible [a solution for the long term rather than this week]
– and, maybe most importantly of all, accessible [the more people that can see something, the more people can buy something!]

It is perhaps this last that has made me shy away from this technology for the moment.

Comment by Andrew Fisher — December 19, 2005

I created a JavaScript Wrapper for my C# Business Logic classes. You specify which dll to “wrap” and the wrapper spits out javascript.

The result is that you can write near-C# in JavaScript! I have all the public classes, constructors, public constants, public methods and public properties available client-side in JavaScript.

The methods and constructors are just “empty” functions that check whether the correct number of parameters is specified and then perform a synchronous web service call. The web service then invokes the constructor or method and returns the resulting value.
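A generated stub of that kind might look roughly like the following. This is a hypothetical reconstruction of the pattern described, not the commenter's actual generator output; the transport is injected so the sketch stays self-contained, where the real wrapper would issue a synchronous XMLHttpRequest to the web service.

```javascript
// Build an "empty" client-side method: it only checks the argument count,
// then forwards the call; the server reflects on the real C# class and
// invokes the method. Names (Invoice, AddTax) are illustrative.
function makeStub(className, methodName, paramCount, transport) {
  return function (...args) {
    if (args.length !== paramCount) {
      throw new Error(methodName + " expects " + paramCount + " argument(s)");
    }
    return transport({ class: className, method: methodName, args });
  };
}

// Fake transport standing in for the synchronous web service round trip:
const fakeService = call => call.args.reduce((sum, n) => sum + n, 0);
const addTax = makeStub("Invoice", "AddTax", 2, fakeService);
```

Because every stub funnels through one generic `transport`, only a single web service endpoint is needed, which is the "don't write hundreds of web services" point in the next paragraph.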

This means that you don’t need to write hundreds of web services and web service methods when you need to interact with the business entities and business logic.

In respect to security the wrapper only wraps public classes and namespaces. And furthermore it checks that a [Wrap] attribute is present.

For even more security WSE provides a mechanism to digitally sign SOAP messages.

Comment by Troels Wittrup Jensen — February 19, 2006
