Look at it as a ratio of data bandwidth from server to client versus replicated
computation. One extreme is low bandwidth, high replicated computation;
replicated complex simulations are in this category. The other extreme is high
bandwidth, low replicated computation: a high frequency of setter (and getter)
method invocations. And of course the continuum between the two is possible.
Where an Island falls depends on how much replicated computation it does.

One interesting idea would be to mirror a parcel of Second Life into a
Croquet Island.

While I still recall: has anyone made viable use of http-accessible resources
(like loading textures, meshes, sounds and such)?

-Zarutian

On Tue, 06 Jun 2006 15:45:56 -0700
Darius Clarke <[hidden email]> wrote:
> The 2nd Life model is probably required to keep the one continuous world
> environment for the most many-to-many interactions. The continuous
> landscape, which has only one map and in which one can fly up high,
> allows one to quickly and easily find something that piques their
> interest (and pocketbook).
>
> The other online games require an avatar to stay on one server.
>
> Croquet requires each avatar's owner to bring to the table a beefy PC
> (a server, technically) that can simulate and run the /entire/ "world"
> that the avatar is in. That's a 1-to-1 avatar-to-server ratio, which
> is why Croquet wouldn't do well in one single massive world with
> massive interactivity. Going through portals is a slower way to scan
> what worlds and content are available.
>
> 2nd Life uses fog to limit the view and the computations, plus 2D limits
> and the programming rules for a certain parcel of land.
>
> Croquet uses portals to limit the views and computations. Which is
> better? Depends on the goal. Since, in Croquet, every PC with an
> avatar in a world recomputes everything in that world, the duplicate
> processing could be considered CPU waste from a server perspective.
>
> I still prefer the Croquet solution though, for the loose coupling
> of adding and removing worlds, and for the responsibility one has for a
> few worlds where they can completely control the rules, including how
> light behaves ... less likely to suffer from the "Tragedy of the Commons".
>
> Darius
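Zarutian's two extremes can be put into rough numbers. Here is a minimal back-of-envelope sketch; all figures are invented for illustration, and nothing here is measured from Croquet or Second Life:

```python
# Back-of-envelope comparison of the two extremes: replicate the inputs
# (every client re-runs the simulation) vs. stream the resulting state
# (a server computes and pushes property updates). Numbers are invented.

def replicated_computation_bps(events_per_sec, bytes_per_event):
    """Low bandwidth, high replication: only input events cross the wire;
    every client re-runs the full simulation locally."""
    return events_per_sec * bytes_per_event

def state_streaming_bps(objects, updates_per_obj_per_sec, bytes_per_update):
    """High bandwidth, low replication: a server computes and pushes
    property updates (setter invocations) for each object."""
    return objects * updates_per_obj_per_sec * bytes_per_update

# A physics-heavy island: few user inputs, lots of local recomputation.
low_bw = replicated_computation_bps(events_per_sec=10, bytes_per_event=64)

# The same scene streamed as setter calls for 500 objects at 10 Hz.
high_bw = state_streaming_bps(objects=500, updates_per_obj_per_sec=10,
                              bytes_per_update=32)

print(low_bw)    # 640 bytes/s on the wire; CPU cost paid on every client
print(high_bw)   # 160000 bytes/s on the wire; CPU cost paid once, centrally
```

Even with these made-up figures, the point is the shape of the tradeoff: the replicated extreme trades wire bandwidth for duplicated CPU, and the streaming extreme does the reverse.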
I look at it a little differently than Darius described. I figure
that everybody does (or can do) tiled worlds. I'm excited that David's
TPortal design may provide some interesting possibilities for being able
to see what's in neighboring worlds (e.g., no need to obscure by fog),
but this remains to be demonstrated.

Within a single world, I've repeatedly seen about ten or twelve users in a
world without problems. (We can scrape together that many machines in our
lab.) I haven't tried more, although I think Mark McCahill may have
witnessed some problems with more machines. Of course, there really hasn't
been much tuning of Croquet at this point beyond Andreas' and David's
original work. My practical engineering experience has been that tuning is
typically worth an order of magnitude. (Note that's not necessarily 10
times!)

In the LONG run, full tea-time with messages between islands presents a
lot of possibilities for adjusting load. E.g., everybody's playing
together in a large world, but some objects in the world are
"uni-islands" (an island with one interesting object that is not a space)
that not everyone needs or wants to join. But until then, with simplified
tea-time, there are still a lot of special-case things that can be done.
The texture cache is one example.

I'm not sure what the question is on http resources:

* Our group shares hi-res textures (and maybe soon sound and video?)
through a global cache managed with its own router. There's no http
involved, unless the user that adds the resource gets it with http in
order to have the bits to add to the global cache. (Nice work by Josh!)

* The way we expose web pages is to have each replicated island contain a
vnc client connected to a server on another machine. That vnc server
machine is getting stuff (once) through http.

Are either of these what you're asking about?

On Jun 6, 2006, at 9:04 PM, [hidden email] wrote:
> Look at it as a ratio of data bandwidth from server to client and
> replicated computation. [...]
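The global-cache pattern Howard describes, where only the user who first adds a resource ever needs the raw bytes (possibly fetched over http) and everyone else pulls from the shared cache, can be sketched roughly like this. The class and method names are invented for illustration; this is not Croquet's actual API:

```python
# Hypothetical sketch of a content-addressed asset cache: blobs are keyed
# by hash, so adding the same texture twice is idempotent and every other
# client can fetch it by key without ever touching http themselves.
import hashlib

class GlobalAssetCache:
    """Stands in for the cache's router; stores blobs keyed by SHA-1."""
    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha1(data).hexdigest()
        self._blobs[key] = data          # idempotent: same bytes, same key
        return key

    def get(self, key: str) -> bytes:
        return self._blobs[key]

class Client:
    def __init__(self, cache: GlobalAssetCache):
        self.cache = cache
        self.local = {}                  # per-machine texture cache

    def add_texture(self, data: bytes) -> str:
        """Only this client needed the original bytes (e.g., via http)."""
        return self.cache.put(data)

    def load_texture(self, key: str) -> bytes:
        if key not in self.local:        # fetch from the cache once, reuse
            self.local[key] = self.cache.get(key)
        return self.local[key]

cache = GlobalAssetCache()
alice, bob = Client(cache), Client(cache)
key = alice.add_texture(b"...hi-res texture bytes...")
assert bob.load_texture(key) == b"...hi-res texture bytes..."
```

Keying by content hash rather than by name is what makes the cache safe to share between islands: identical bytes always resolve to the same key, regardless of who added them.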
On Wed, Jun 07, 2006 at 09:13:14AM -0500, Howard Stearns wrote:
> Within a single world, I've repeatedly seen about ten or twelve users
> in a world without problems. (We can scrape together that many
> machines in our lab.) I haven't tried more, although I think Mark

I presume this is based on a LAN, right? How many users on ADSL-connected
machines (128 kbit/s upstream) are feasible?

There's an advantage in centralism with the current kind of infrastructure,
though in the future, 100 Mbit/s symmetrical bandwidth and very fast
machines on each user's end may tip the balance back to P2P again. In a
mirror-world approach, the tessellation of domains over a virtual Earth
would follow node distribution over physical space, since routing would
have to be geography-aware, and local bandwidth at low latency would be
easily available on the local loop.

--
Eugen* Leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE
On Jun 7, 2006, at 9:53 AM, Eugen Leitl wrote:
> I presume this is based on a LAN, right?

Multiple local high-speed LANs being used simultaneously. Some were 802.11g
wireless. I've also been connecting from home (slow 802.11b wireless to a
consumer router to a cable modem throttled down by the local bottleneck
provider) with excellent results.

> How many users on ADSL-connected
> machines (upstream 128 kBit) are feasible?

I think a slow client (whether because of network or machine) does not
directly affect the other users, nor does it directly affect how many users
can be connected. There are secondary (e.g., social) effects, however. For
example, suppose someone is operating under high latency. They might turn
WAY more than they want to because they don't see anything happening. Then,
when it does happen, they need to correct their over-turn. People trying to
interact with such folks will have a crummy experience. Also, this creates
more traffic. If everybody is having this problem, there is more traffic
than is really needed or desired, and this can affect the number of users
that the router can support.
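Howard's over-turn effect can be shown with a toy simulation: the user keeps turning until the heading they *see* reaches the target, but the display lags the true heading by the network round trip, so the true heading overshoots. All parameters here are invented for illustration:

```python
# Toy model of over-turning under latency. Time advances in fixed ticks;
# the displayed heading is simply the true heading from lag_ticks ago.
# The overshoot (which the user must then correct, generating extra
# traffic) grows in direct proportion to the lag.

def overshoot_deg(target_deg, deg_per_tick, lag_ticks):
    true_heading = 0
    history = [0]                        # true heading at each past tick
    while history[max(0, len(history) - 1 - lag_ticks)] < target_deg:
        true_heading += deg_per_tick     # user still holds the turn key
        history.append(true_heading)
    return true_heading - target_deg     # degrees turned past the target

print(overshoot_deg(90, deg_per_tick=1, lag_ticks=5))    # 5
print(overshoot_deg(90, deg_per_tick=1, lag_ticks=50))   # 50
```

In this model the overshoot equals turn rate times lag, so a tenfold increase in latency means a tenfold larger correction, which matches the intuition that high-latency users generate disproportionate corrective traffic.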