Date:      Sat, 2 Aug 2003 19:55:17 +1000 (EST)
From:      Bruce Evans <bde@zeta.org.au>
To:        Julian Elischer <julian@elischer.org>
Cc:        Marcel Moolenaar <marcel@xcllnt.net>
Subject:   Re: NVidia glx stuff dies in sysarch(I386_SET_LDT, ...)
Message-ID:  <20030802195010.S2520@gamplex.bde.org>
In-Reply-To: <Pine.BSF.4.21.0308011722550.46065-100000@InterJet.elischer.org>
References:  <Pine.BSF.4.21.0308011722550.46065-100000@InterJet.elischer.org>

On Fri, 1 Aug 2003, Julian Elischer wrote:

> On Fri, 1 Aug 2003, Julian Elischer wrote:
> > I also noticed that if we disable the 'splat' mode, we'd break SysVR4
> > binaries, since they do that (though it's #if 0'd out at the moment)
>
> not to mention Linux (more important..), though I might add that that
> code could do with rewriting to get rid of a lot of the "stackgap" stuff.

Even changing the semantics of i386_set_ldt() to support dynamic allocation
may break ABI compatibility with Linux (not to mention both API and ABI
compatibility with OtherBSD).  I wondered about this when the change was
proposed, but only had a quick look at an old version of Linux for API
compatibility.  Linux seems to have a different API.
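
A rough sketch of the two interfaces, from memory of the respective man
pages (header and struct names are approximate, and LDT_AUTO_ALLOC is
assumed as the name of the dynamic-allocation sentinel being proposed):

    /* FreeBSD (<machine/segments.h>, <machine/sysarch.h>): the caller
     * either names the slot in start_sel, or -- under the proposed
     * dynamic allocation -- passes LDT_AUTO_ALLOC and the kernel picks
     * a free slot and returns its index. */
    union descriptor d;
    /* ... fill in d as a segment descriptor ... */
    int slot = i386_set_ldt(LDT_AUTO_ALLOC, &d, 1);

    /* Linux (<asm/ldt.h>; the struct is modify_ldt_ldt_s in 2.4-era
     * headers, later renamed user_desc): the caller always chooses
     * entry_number up front, so there is no allocated-slot return
     * value, only success or failure.  glibc does not wrap the call,
     * so it goes through syscall(SYS_modify_ldt, ...). */
    struct modify_ldt_ldt_s u;
    u.entry_number = slot;
    /* ... base_addr, limit, seg_32bit, etc. ... */
    syscall(SYS_modify_ldt, 1, &u, sizeof(u));  /* func 1 == write entry */

Treat the above as illustrative only; the point is that Linux has no
kernel-chosen-slot mode for callers to rely on.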

Bruce


