Date:      Fri, 21 Apr 2023 18:01:45 -0400
From:      Aryeh Friedman <aryeh.friedman@gmail.com>
To:        Mario Marietto <marietto2008@gmail.com>
Cc:        questions@freebsd.org, FreeBSD Mailing List <freebsd-hackers@freebsd.org>
Subject:   Re: Installing openAI's GPT-2 Ada AI Language Model
Message-ID:  <CAGBxaX=mgbx3xjxCyYySR1XGNG013TkebXJ7o8QNak=0HqGyhQ@mail.gmail.com>
In-Reply-To: <CA+1FSii7q0vZ4v9R94-G-=G+7JxYhS_h+85ApMce0XJ7MCAc9w@mail.gmail.com>
References:  <CAGBxaXmhRLk9Lx_ZHeRdoN-K2fRLEhY3cBVtBymmAjd4bBh1OQ@mail.gmail.com> <20230421134120.GA12251@darkbeer.org> <CAGBxaXm+SBaiKyvk28+KiO1bqP4gfFjMDZz3xB5wbexb6gJUvQ@mail.gmail.com> <CA+1FSii7q0vZ4v9R94-G-=G+7JxYhS_h+85ApMce0XJ7MCAc9w@mail.gmail.com>

On Fri, Apr 21, 2023 at 5:37 PM Mario Marietto <marietto2008@gmail.com> wrote:
>
> 1) try to use the linuxulator instead of bhyve (remember what Albert
> Einstein said about people who want to solve a problem without changing
> their perspective: he did not have kind words for them)

For long-term deployment (eventually I want a VM image people can just
download and run), running it directly on the host is not a good idea.
Due to a lack of hardware resources (and budget), the only
non-production/development platform I can use is a VM on my production
VM host (which already carries 3 other production machines).  Thus
*ANY* solution that does not involve VMs is a no-go.

>
> 2) actually an nvidia gpu can be passed-thru within linux (and within
> windows 11 if the gpu is amd)

See above: we don't want to force people to have specific hardware.
If I remember right, weren't you the person I was helping late last
year to set up pass-thru on bhyve for your nvidia card, and it turned
out to need a bhyve patch (I assume from what you have said that this
was for the 2 OS's you mentioned)?  If I remember right, this was
specifically so you could set up GPT-2.

>
> 3) the tutorial that we are suggesting may work even if you don't want
> to use your gpu

I was privately pointed to another tutorial that explains how to do it
in a VM (I hope ;-)).
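
For what it's worth, here is a rough sketch of what a CPU-only run
could look like inside the VM.  It assumes the Hugging Face
transformers port of GPT-2 rather than whatever that tutorial uses, so
the "gpt2" model name and the torch/transformers packages are my own
assumptions, not something taken from the tutorial:

    # CPU-only GPT-2 sketch (assumes: pip install torch transformers).
    # Everything runs on the CPU, so no GPU pass-thru is needed in the VM.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    device = torch.device("cpu")
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)
    model.eval()

    prompt = "FreeBSD is"
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        output = model.generate(inputs["input_ids"], max_length=40)
    print(tokenizer.decode(output[0], skip_special_tokens=True))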

>
> 4) probably you should find the exact version of pytorch that it wants

According to ChatGPT (and other sources) it will work with anything
newer than 1.13, but they don't specify anything beyond that.
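
If it helps, a small runtime guard along these lines could catch a
too-old torch early.  This is only a sketch of mine; the 1.13 floor is
just what chatGPT and the other sources claim, and the packaging module
is assumed to be available (it normally ships alongside pip):

    # Sketch of a version guard for the claimed "newer than 1.13" requirement.
    # The exact floor is unverified; adjust it if the tutorial says otherwise.
    import torch
    from packaging import version

    have = version.parse(torch.__version__.split("+")[0])
    need = version.parse("1.13")
    if have < need:
        raise RuntimeError(f"PyTorch {torch.__version__} is too old; need {need}+")
    print(f"PyTorch {torch.__version__} should be OK")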


