Originally posted by mrbouffant
Desirable attributes would be:
1. Fairness
2. Ease of use
3. Optimum ticket sales for the vendor(s)
4. Effectiveness of the processing
Re Fairness, an allocation based on a first-come-first-served process is not necessarily fair, and if getting into the system was as haphazard as it seems to have been, then this suggests that fairness was not a high priority for the developers.
Ease of use seems to have been poor. Some people are more forgiving, but needing to find ways to make the system work using multiple computers, iPads etc. hardly suggests that everybody found it an easy process.
We don't know about the optimisation of ticket sales for the vendors. If each ticket is viewed independently of the others, then all solutions are pretty much the same, but with more complex requests the final allocations may not be optimal for the vendors. The input data doesn't really allow for this, but real-time behaviour during the 30 minutes allowed for each online purchase might. For example:
I want 10 seats for concert X at price A.
I want 4 seats for concert Y at price B.
I want 15 seats for concert Z at price C.
If all the seats are allocated then the revenue for the vendor is 10A+4B+15C.
However, the user may not buy any seats if there are not 10 contiguous seats for concert X. This would represent a potential loss of seat sales by the vendor(s) to the value of 10A.
If allocations are firm, then other users may already have been allocated seats, thus preventing me from obtaining the 10 contiguous seats.
If we imagine running the allocation process many times, then sometimes there will be contiguous seats at the time at which my request is processed, and the vendor will achieve more sales from my allocation, while on other runs my request for those seats will be blocked, and both I and the vendor lose out.
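As a rough illustration of that order dependence, here is a minimal sketch (hypothetical seat layout, prices and processing order - not how the real system works) of an all-or-nothing request for contiguous seats either being satisfied or contributing nothing to the vendor's revenue, depending purely on who happened to be processed first:
[CODE]
# Hypothetical sketch: an all-or-nothing request for a contiguous block of seats
# either succeeds in full or contributes nothing to the vendor's revenue.
def allocate(row_length, requests):
    """requests: list of (block_size, price_per_seat), processed in arrival order."""
    free = [True] * row_length                    # one row of seats, all initially free
    revenue = 0
    for size, price in requests:
        # look for the first run of `size` contiguous free seats
        for start in range(row_length - size + 1):
            if all(free[start:start + size]):
                for i in range(start, start + size):
                    free[i] = False
                revenue += size * price
                break                             # request satisfied, move to next buyer
        # if no contiguous block exists, the buyer walks away and the sale is lost
    return revenue

# Same three requests, same 20-seat row, different processing order:
requests = [(10, 25), (4, 25), (15, 25)]
print(allocate(20, requests))                     # 10 + 4 seats sold -> revenue 350
print(allocate(20, list(reversed(requests))))     # 15 + 4 seats sold -> revenue 475
[/CODE]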
On balance, a pragmatic view is that it will all come out in the wash anyway: the vendor has many prospective purchasers for seats, and may be happy with a sub-optimal allocation, as sales will be high enough.
In order to achieve an optimal solution using this kind of analysis, a lot of information about requests and preferences needs to be available in advance. Ideally all the preferences need to be known before even one ticket is allocated. However, the real world isn't like that.
A slightly similar problem, if we are trying to consider optimal allocations, is the Stable Marriage Problem - http://en.wikipedia.org/wiki/Stable_marriage_problem - I am not claiming that this is the same problem; it merely has some similarities. Solutions to that problem are known, but if there are N matches to be made the algorithms are O(N^2). Suppose N is 100,000 (I'll come back to that later). In that case the processing time is proportional to 10,000,000,000 times some basic time unit - whatever that is. Modern computer systems are fast, but they can still be nobbled by problems which explode as they scale up. Suppose we consider a basic time unit of 1/100 of a second. With the (speculative) values here, it would take 100,000,000 seconds to get the job done - which is over 3 years!
With algorithms like that, what value of N would give an acceptable processing time? Turns out a value of N=1000 gets the job done in under 3 hours - about 2.778 hours.
N=10000 gives over 11 days - about 11.6 days.
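Those figures are easy to reproduce - a quick back-of-the-envelope calculation in Python, keeping the same (purely speculative) 1/100-second cost per basic operation:
[CODE]
# Back-of-the-envelope timing for an O(N^2) matching algorithm,
# assuming a (speculative) cost of 1/100 second per basic operation.
UNIT = 0.01  # seconds per basic operation

for n in (1_000, 10_000, 100_000):
    seconds = (n ** 2) * UNIT
    print(f"N={n:>7,}: {seconds:>13,.0f} s = {seconds / 3600:>8,.1f} h = {seconds / 86400:>7,.1f} days")

# N=  1,000:        10,000 s =      2.8 h =     0.1 days
# N= 10,000:     1,000,000 s =    277.8 h =    11.6 days
# N=100,000:   100,000,000 s = 27,777.8 h = 1,157.4 days  (over 3 years)
[/CODE]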
So, if we are looking for solutions which satisfy some fairly hard optimality requirements, the processing time can become very large if there are a lot of matches to do, and the problem cannot be solved in a reasonable time. How can this be resolved? In several ways: reduce the number of matches, relax the conditions on optimality so that sub-optimal solutions are considered acceptable, and so on.
What about the magic value of N? Well, the RAH has approximately 5000 seats, and there are slightly fewer than 80 concerts in the Proms series - and not all of them are in the RAH. From the vendor's point of view each possible seat should be sold - and that is about 400,000 tickets.
Now let's tackle this from another direction. Suppose there is demand for all the tickets on the opening booking day. We in fact know that this is not the case, as only a few concerts are actually sold out. If we have an algorithm which is O(N) - i.e. linear, rather than quadratic - how long would the processing take now?
Again, assuming a basic time unit of 1/100 sec, if we choose N=400,000 then the processing time would be 4000 seconds, which is about 1.11 hours - under an hour and a quarter.
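Putting the linear and quadratic cases side by side at N=400,000, with the same speculative unit cost, makes the difference stark:
[CODE]
# Linear vs quadratic scaling at N = 400,000 tickets,
# still with the speculative 1/100-second cost per basic operation.
UNIT = 0.01
N = 400_000

linear_s = N * UNIT           # O(N)
quadratic_s = N ** 2 * UNIT   # O(N^2)

print(f"O(N):   {linear_s:,.0f} s = {linear_s / 3600:.2f} hours")
print(f"O(N^2): {quadratic_s:,.0f} s = about {quadratic_s / 86400 / 365:.0f} years")

# O(N):   4,000 s = 1.11 hours
# O(N^2): 1,600,000,000 s = about 51 years
[/CODE]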
Where does all this fit in with reality? The indications are that trying to do a complex matching process to give optimal results for both the vendors and buyers may not be feasible within the time available. A much simpler matching process could be done in a reasonable time.
The existing "solution" does seem to fail with respect to fairness, as access to the waiting room seems to be very haphazard. Personally I don't see any reason why some form of batch processing wouldn't be acceptable, and it could be achieved within a reasonable time - say 1 day. Putting in reasonable requests in a suitably encoded form, and then having all the requests from all customers processed together with allocations notified a day later, would not be unreasonable IMO. It could be fairer and a lot less frustrating for customers, so in fact I disagree that the current method is preferable to the earlier offline methods.
The current system does not have any way for a user to express significant preferences - such as "If I can't get all of my seats for concert A, then I would like to go to concert B, D etc." Other booking systems do allow or require users to state a ranking of preferences, which could lead to fairer outcomes; otherwise some users may get all of their first preferences, while others may get nothing - not even their lowest preference tickets.
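To make the batch-plus-preferences idea concrete, here is a minimal sketch (the encoding, the data and the allocation rule are all hypothetical - nothing like this exists in the current system) of ranked requests being collected and then processed in a single offline pass:
[CODE]
# Hypothetical batch allocation over ranked preferences.
# Each customer submits an ordered list of (concert, seats_wanted) choices;
# the overnight batch run grants each customer their highest-ranked choice that still fits.
from collections import namedtuple

Request = namedtuple("Request", ["customer", "preferences"])   # preferences: [(concert, seats), ...]

def batch_allocate(requests, capacity):
    """capacity: dict of concert -> seats remaining. One linear pass over all requests."""
    allocations = {}
    for req in requests:
        for concert, seats in req.preferences:                 # try choices in ranked order
            if capacity.get(concert, 0) >= seats:
                capacity[concert] -= seats
                allocations[req.customer] = (concert, seats)
                break                                          # highest feasible preference granted
        else:
            allocations[req.customer] = None                   # nothing available at any rank
    return allocations

requests = [
    Request("alice", [("A", 10), ("B", 10)]),
    Request("bob",   [("A", 4),  ("D", 4)]),
]
print(batch_allocate(requests, {"A": 12, "B": 20, "D": 6}))
# {'alice': ('A', 10), 'bob': ('D', 4)}
[/CODE]
Even something this crude is a single linear pass over the requests - the O(N) regime discussed above - and it gives every customer at least a shot at their lower preferences rather than nothing at all.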
We also now know that only a few concerts are fully sold out, and that those were the ones which were in some way problematic. They could have been anticipated in advance - either based on previous experience, or on an analysis of data from the Proms Planner or similar earlier data gathering, or both.
Trying to mix online, real-time input with offline processing may be a mistake too. Some users were perhaps trying to make their selections on a web-based system in real time, while other users had their allocations dealt with using input from the Proms Planner. The latter could have been made an almost completely offline process - why did users have to log on to do that?