What next for Forums/online discussion?

Made in gb
Calculating Commissar
England

I am a fan of "open book" exams; I had a few towards the end of medical school aimed at applying your skills and knowledge from earlier years.

   
Made in gb
[SWAP SHOP MOD]
Killer Klaivex

I teach at a world-leading department in a Russell Group university, and AI is a genuine developing problem. Two years ago, an AI essay was immediately identifiable. It hallucinated references and often spat out crap that made no sense. About a year ago, you started being able to attach specific documents and have it work from those - and the quality of output immediately shot up. AI is now capable of putting out a humanities/social sciences paper of about high 2:2/low 2:1 quality by a first-year grading rubric.

The thing is, I can still spot them. My own brain is a finely honed computer that's read thousands of essays. I can pick them out by how they phrase, how they use certain key words, how they structure and format. There are too many similarities in AI style. Even so, last year's end-of-term assignments contained about 12 AI essays out of 50 for one of my modules, as students believe the sales patter and are often lazy. What did I do? I picked them apart, blasted them on everything I could think of, and gave them thirds. Lower than what the paper deserved, but more than they deserved.

I did that without even mentioning AI because my intent is to make them realise that the AI cannot do it all for them. To shock them into not doing it again. You might ask why, because it clearly CAN write an essay for them. But here's the key.

I specified that it writes a high 2:2/low 2:1 essay by a FIRST YEAR rubric. Second and third year get more difficult because we start assigning more weight to critical analysis, whereas first year is more about structure/formatting and teaching them how to read academically. If they skip the first year's learning by relying on the AI, they're fethed for second and third year. The AIs are just fine with putting together a cursory, surface-level overview, but their ability to actually make connections and fit information together is poor.

The computer will take the jobs of those who skim through university and never really learn much, or who didn't have much ability in the first place. All those middling students who drift along and do a degree because it was expected, whilst pretending to develop generic 'transferable skills'. But for the actual hard workers/ones with half a braincell who do it right? The computer doesn't seem likely to imminently depose them.

The problem is when you have elderly staff who don't know anything about it, grade the AI paper in first year with a half-decent mark, and consequently encourage students to rely on it. Then second and third year hit and they fail. Their own fault perhaps, but not what we should be trying to achieve from a more holistic view of 'teaching the young'.




 
   
Made in de
Joined the Military for Authentic Experience
Nuremberg

The assessment model I thought might defeat AI would be "oral" exams but done using a chat application. This removes bias based on physical attributes. You have the examiner in one room and the student in another communicating in chat. The student is supervised by an invigilator to prevent dishonest conduct.

The problem is the scalability of that system. You can't assess lots of people at once easily, there's a large staffing load.

   
Made in gb
[SWAP SHOP MOD]
Killer Klaivex

 Da Boss wrote:
The assessment model I thought might defeat AI would be "oral" exams but done using a chat application. This removes bias based on physical attributes. You have the examiner in one room and the student in another communicating in chat. The student is supervised by an invigilator to prevent dishonest conduct.

The problem is the scalability of that system. You can't assess lots of people at once easily, there's a large staffing load.


Assuming that it continues to improve, there are two solutions.

The first is a return to blue book exams in a hall. They can't cheat in it, but it also denudes any output of real thought or analysis. Exams like that are good mainly for rote memorisation testing.

The second is one of the two or three methods of recording the writing of the paper so the marker can check it, keystroke by keystroke and second by second. If someone is just copy-pasting, or even copying by hand from an AI paper, it's pretty obvious. There's an organic flow to drafting a paper, and no AI model currently on the market can mimic that.
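A minimal sketch of the kind of check such a tool could run - the event format, threshold values and function names here are illustrative assumptions, not a description of any real product:

```python
# Illustrative sketch only: given a log of editing events (timestamp in
# seconds, characters inserted), flag bursts that look like pasted blocks
# rather than organic typing. Thresholds are made up for the example.
from dataclasses import dataclass

@dataclass
class EditEvent:
    timestamp: float   # seconds since the writing session started
    chars_added: int   # characters inserted by this event

def suspicious_bursts(events, paste_threshold=200, max_chars_per_sec=15.0):
    """Return events that dump a large block at once, or whose effective
    typing rate since the previous event is implausibly fast."""
    flagged = []
    prev_time = None
    for ev in events:
        big_block = ev.chars_added >= paste_threshold
        gap = ev.timestamp - prev_time if prev_time is not None else None
        too_fast = gap is not None and gap > 0 and ev.chars_added / gap > max_chars_per_sec
        if big_block or too_fast:
            flagged.append(ev)
        prev_time = ev.timestamp
    return flagged

# Steady typing for a minute, then 1,500 characters appear in one go:
log = [EditEvent(float(t), 4) for t in range(0, 60, 2)] + [EditEvent(61.0, 1500)]
print(suspicious_bursts(log))   # flags only the final burst
```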


 
   
Made in de
Joined the Military for Authentic Experience
Nuremberg

Yeah, a bit like the way captcha systems detect when a human is using a mouse by the various imperfections in the motions. That could work.

It'll be an arms race though, won't it?

   
Made in gb
Calculating Commissar
England

Could universities go back to using vivas more? If someone got AI to write their essay, they probably won't be able to discuss it in great detail or defend it against questioning. If they can anyway, then they still understand the material, and therefore it matters less what tool they've used.



   
Made in de
Joined the Military for Authentic Experience
Nuremberg

Yeah, that's what I mean by oral exams, they do them quite commonly in Germany I believe. My only issue with them is it does open up the potential for various kinds of discrimination, so I like the idea of it being a "blind oral" where the examiner cannot actually see the candidate.

   
Made in gb
Calculating Commissar
England

 Da Boss wrote:
Yeah, that's what I mean by oral exams, they do them quite commonly in Germany I believe. My only issue with them is it does open up the potential for various kinds of discrimination, so I like the idea of it being a "blind oral" where the examiner cannot actually see the candidate.

I think blinded exams are a good idea.

I do think there are potential issues with capable students who are bad at social interactions, but there is probably some kind of reasonable adjustment that can be applied as appropriate. I have a lot of social anxiety and I hated oral and clinical exams in my early years, but by 5th year I was quite comfortable justifying my decisions verbally and clinical exams were not as stressful. So perhaps courses just need to incorporate training for oral exams/vivas if they become standard.

   
Made in gb
[SWAP SHOP MOD]
Killer Klaivex

Da Boss wrote:Yeah a bit like the way the captcha systems detect when a human is using a mouse by the various imperfections in the motions. That could work.

It'll be an arms race though, won't it?


It's already on the market in a few different forms. Generally speaking, students accept it if you phrase/introduce it as 'you need to be able to prove you wrote the paper' as opposed to 'I need to prove you used AI'. Shifts the responsibility and gets buy-in.

But yes, eventually, I don't doubt someone will submit five thousand recordings to an AI, and just make a program that will mimic that organic style too. At which point, we're into the realm of 'you have to use this specific program to write/record', at which point someone will try and write a program to input to that program...

At the end of the day, it's a bit like torrenting. You can put all the effort you want in, but you can't stop a truly determined cheater - but then again, the only person they cheat is themselves. Their work will suck and they'll leave uni without developing any skills at all. And employers will spot that pretty quick.


Haighus wrote:Could universities go back to using vivas more? If someone got AI to write their essay, they probably won't be able to discuss it in great detail or defend it against questioning. If they can anyway, then they still understand the material, and therefore it matters less what tool they've used.


I've done that in the past to prove AI usage. You call someone in, jab a finger at a bit of the printed out essay, and ask them to explain what they wrote. Repeatedly. Guaranteed breakdown every time.




 
   
Made in gb
Calculating Commissar
England

I suppose when it comes down to it, AI can be combated, but all of these techniques require more time and investigation from educators at a time when educator working conditions and budgets are being slashed. I think therein lies the real underlying issue.

   
Made in gb
Decrepit Dakkanaut
UK

The other thing is that the way universities currently work, with students scheduling their own work time, means you can't just lock them in a room to write their essay (and even then you have to watch for them being really sneaky with a mobile phone or such).

That can work at school - plus school work often comes from a textbook, so you can even just lock a room out of the internet entirely. No phone, no net, and the students HAVE to work with what they've got.


At uni it "could work", but it requires some major fundamental changes and you can argue that it defeats part of the point of uni. It also works against many students who need to work jobs or take part-time courses and such.

Another option is to simply shift a lot of uni grading away from essays and more into practical work skills. There's even a net gain for students, because they leave with more practical working skills in their chosen field than just the one they develop for their dissertation (since often that's the only skill they can develop to any level, with the rest being more a taste of a skill than real development).

This does require more input from the staff and more hours teaching, but there's a net gain in pushing out AI and bringing in work skills; which, let's face it, matters because uni is NOT about pure academia for most people these days.




Granted, some subjects like English are going to be less easily adapted and might have to go the "here's the dead room, take your references in, there's no net, get to work" route.


Actually, thinking on that, you could have work-rooms with restricted access (eg just to official documents/papers/references*) which are the only place students can write and submit papers. Open all hours with some casual monitoring and a "no phones" policy etc.


*Another net gain: it locks out false internet articles.


Automatically Appended Next Post:
 Haighus wrote:
I suppose when it comes down to it, AI can be combated, but all of these techniques require more time and investigation from educators at a time when educator working conditions and budgets are being slashed. I think therein lies the real underlying issue.


Yeah, this is the big one. Heck, I went through a college a while back and budgets were being slashed mid-term, to the point that some courses were losing staff partway through. You could also feel the insanity of it on the staff as they got more and more work and paper-trail stuff to do whilst at the same time having their budgets, funds, facilities and everything slashed, and working in a "we might have to let you go soon because money" atmosphere.



   
Made in gb
Calculating Commissar
England

Personally, I think that restricting the freedom of students is the wrong approach and ultimately counterproductive. As Ketara points out, university students are adults and can choose to take lazy shortcuts instead of learning. The challenge is catching them being lazy and making sure they know there are consequences for not actually gaining the skills while they still have the chance to rectify it.

As mentioned if they graduate having cheated through, it will show in most workplaces. Unless they become a journo or politician I suppose.

   
Made in gb
Decrepit Dakkanaut
UK

True, however the risk is that a lot of workplaces just care that you've got a degree and a grade. If "everyone is using AI to get a 1st", even if it's only on a few of the course options, then you risk having a culture develop where not using AI leaves some students behind others grading-wise.


Ergo those putting in the effort end up not getting the reward at the end. Sure you can argue that those who cheated might well end up in jobs that they aren't skilled for, but the key thing is that they will get those jobs and the punishment comes years after they started cheating.


Especially if you consider that most workplaces don't require you to write an essay as your job. So the job already expects graduates to turn up with a certain level of understanding and skill, but still to need training to actually do the job itself (which is another whole argument: that you can spend a fortune going to uni and then still not get the job because you "lack experience").



   
Made in gb
[SWAP SHOP MOD]
Killer Klaivex

Haighus wrote:I suppose when it comes down to it, AI can be combated, but all of these techniques require more time and investigation from educators at a time when educator working conditions and budgets are being slashed. I think therein lies the real underlying issue.


Definitely. It takes me three times as long to tear apart an AI essay, because I need to justify the grade beyond 'it's AI but I can't prove it 100%'. So it's a certain degree of extra work on my end.

Overread wrote:
Another option is to simply shift a lot of uni grading away from essays and more into practical work skills. There's even a net gain for students, because they leave with more practical working skills in their chosen field than just the one they develop for their dissertation (since often that's the only skill they can develop to any level, with the rest being more a taste of a skill than real development).

In STEM, they're already more likely to be developing practical skills - but it's pretty hard to come up with generic 'practical' skills in the humanities/socsci beyond research methodology and the existing transferable skillset (aka, how to find information, ingest it, apply critical thinking, and then output it for a given task).

Haighus wrote:
As mentioned if they graduate having cheated through, it will show in most workplaces. Unless they become a journo or politician I suppose.

Overread wrote:True, however the risk is that a lot of workplaces just care that you've got a degree and a grade. If "everyone is using AI to get a 1st", even if it's only on a few of the course options, then you risk having a culture develop where not using AI leaves some students behind others grading-wise. Ergo those putting in the effort end up not getting the reward at the end. Sure you can argue that those who cheated might well end up in jobs that they aren't skilled for, but the key thing is that they will get those jobs and the punishment comes years after they started cheating.

Especially if you consider that most workplaces don't require you to write an essay as your job. So the job already expects graduates to turn up with a certain level of understanding and skill, but still to need training to actually do the job itself (which is another whole argument: that you can spend a fortune going to uni and then still not get the job because you "lack experience").


It's worth bearing in mind that an AI, on the existing underlying framework, is unlikely to be capable of developing critical analytical skills. Pattern recognition can only get you so far - and how far that is will not equate to a first. It also doesn't get you into the good university in the first place (domestically at least - you have to do well in the written exams to even apply). So the good students who get the top grades from top universities are still distinguished from the flock. That's something employers can hang onto, at least.

Although whether people would be in favour of the Loxbridge triangle further solidifying their grip is another question altogether.



 
   
Made in us
Battlefield Tourist
MN (Currently in WY)

We should be teaching them how to use these tools really well, as opposed to avoiding them. In the future, these will be the tools everyone will be using.

   
Made in gb
Calculating Commissar
England

 Easy E wrote:
We should be teaching them how to use these tools really well, as opposed to avoiding them. In the future, these will be the tools everyone will be using.

On the whole, I agree. But it won't be a lot of any course, because LLM AI is quite limited currently, and unlikely to become significantly less limited as it is entirely derivative.

   
Made in us
Master Tormentor
St. Louis

 Easy E wrote:
Recall is old-school. Research and analysis is the future.

Gonna be honest: That's always been the case. The modern industrial school system just does a piss poor job at teaching those skills, instead focusing on basic skills needed for the labor market (math, basic proficiency at writing and reading comprehension, etc.). That's why a strong education in the humanities is important: They do teach those research and analysis skills, especially in the fields of history, social sciences, and the arts.
   
Made in gb
Longtime Dakkanaut
London

 Haighus wrote:
 Easy E wrote:
We should be teaching them how to use these tools really well, as opposed to avoiding them. In the future, these will be the tools everyone will be using.

On the whole, I agree. But it won't be a lot of any course, because LLM AI is quite limited currently, and unlikely to become significantly less limited as it is entirely derivative.


I look forward to the future when most of the material LLMs find is other LLM content...
   
Made in gb
Calculating Commissar
England

The_Real_Chris wrote:
 Haighus wrote:
 Easy E wrote:
We should be teaching them how to use these tools really well, as opposed to avoiding them. In the future, these will be the tools everyone will be using.

On the whole, I agree. But it won't be a lot of any course, because LLM AI is quite limited currently, and unlikely to become significantly less limited as it is entirely derivative.


I look forward to the future when most of the material LLMs find is other LLM content...

Ah yes, when the internet is taken over by bots scamming each other. See X for a sneak preview.

   
Made in gb
[SWAP SHOP MOD]
Killer Klaivex

 Easy E wrote:
We should be teaching them how to use these tools really well, as opposed to avoiding them. In the future, these will be the tools everyone will be using.


Yeah, we hear this a lot from AI prophets. Usually the same people who were selling crypto last week.

AI has some academic/workplace use. But only as a supplement, because ultimately, being able to line up a good AI prompt only takes about a day to learn. It's not something you really need to be educated in or spend time studying. Although I'm sure the hordes of new 'AI artists/authors' would disagree, as they hit 'generate' over and over, whilst wiping the non-existent sweat from their brows.

Not to mention that going out of our way to push it at uni sends something of a very mixed message when we -don't- want them to use it there. Even at the most fundamental, non-problematic level, having a student rely on AI to get all their spelling right means they never learn to spell correctly. Having the computer lay out the essay means the student never learns how/why different styles of formatting are important. This issue grows with the extent to which the student relies on the computer. There's a reason we still teach arithmetic in schools when a calculator can do it for them.

Otherwise, the human race ends up like this: [embedded image]






 
   
Made in gb
Decrepit Dakkanaut
UK

The biggest thing with AI use isn't prompting, it's reviewing whatever it created for you. Which, honestly, to get to "as good" a standard as the real thing often takes just as long as doing the thing yourself in the first place.

The big problem area for AI is people who can't do a skill using it to do that skill, and then not being able to see the errors or problems because they aren't skilled in that field.

OR those who are skilled and accept that the AI will produce a lesser result, but it's "good enough" and cheaper for a simple, basic task.


The workplace problem is managers who latch onto option B and use it to justify cutting back on investing in key areas of a firm that once had workers.




There's also the timebomb that option B, in theory, will improve over time. Though I do wonder by how much, if the internet is used for training all the time; AI trainers will have to keep finding archives of proven human creation to train their AI on, to avoid AI training itself on its own creations and thus confusing itself.

   
Made in de
Joined the Military for Authentic Experience
Nuremberg

People compare AI to calculators a lot, and I think it's a good comparison. Calculators remove the need to develop certain skills by doing it for you. And that's great, if all you want to do is simple operations.

But by outsourcing that thinking to the calculator, you've cut yourself off from developing the skill yourself, and there are other things the calculator cannot do that you now also cannot do, and cannot progress to, because your number skills are so weak.

You see this most clearly with fractions. Kids learn that fractions can be expressed as decimals, and that the calculator can trivially do operations with decimals, and then they think "I'm never doing fractions again" - and that works right up until they need to do algebra, and then it breaks down. So many kids get caught at that point and it hampers their development hugely.
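A tiny illustration of that cliff edge, using Python's built-in fractions module - my own example, not anything from the curriculum being described:

```python
# "Just turn it into a decimal" holds up until exactness starts to matter.
from fractions import Fraction

print(3 * 0.333)                 # roughly 0.999 - the rounded decimal never gets back to 1
print(3 * 0.333 == 1)            # False
print(3 * Fraction(1, 3))        # 1 - the fraction does, exactly
print(3 * Fraction(1, 3) == 1)   # True
```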

AI will be the same, for all the things it does, like making summaries, structuring a written argument, and so on.

I'm completely against it, as an educator. I'm there to teach kids how to think, not to outsource it to a machine because it's convenient.

In education, if I set you a task, it is for your development and so that you can be measured. The task being done by an LLM means the task is utterly pointless. Doing it more quickly and with less effort defeats the entire purpose.

   
Made in gb
Decrepit Dakkanaut
UK

I think AI goes beyond calculators. Calculators purely perform a function that you present to them. You still have to create the function for them to perform.

AI isn't just performing a function, it's creating the function in the first place from prompts. It's doing all the fun creative things like writing, coding, drawing, sculpting (as far as I know digital sculpting is still lagging, but they are working on it).


It's doing all the expressive things people wanted to do in life, from prompts. Sure, it gets a lot of fine details wrong; it has no contextual understanding, it's heavily reliant on existing mass-market media/material to derive from, and it can't tell if it's right or wrong.

Sadly a lot of people don't care about all that and won't notice. AI is currently the ultimate Dunning-Kruger.

   
Made in us
Decrepit Dakkanaut

I guess where I take my view of AI and its use in professional/academic settings is that we need to ask the question: is AI supplementing human skill, or replacing it?

To explain what I mean, I'll use an example from a tour I took some years ago. During uni, in business school, we took a tour of the BMW manufacturing plant in Munich along with the museum. Our tour guide wasn't "just" a tour guide, as one would usually get. I guess my prof had an in with the company.

Anyhow, basically, this middle manager explained that, as often as they can, they seek solutions that do not cut labor unless absolutely necessary, or that cut it through retirement (as in, we'll hold on to a worker, but when a technology has replaced them, we keep them on until they retire, then just don't fill the role anymore). So, there's a section where they need to install seats. The seat installers wear an exo-suit robot. Instead of two workers lifting a 20 lb weight 100 times a day, this robot makes it feel like they are lifting 1 lb 100 times a day. The thinking is that they want to make the job less physically demanding, to allow the worker to actually enjoy their time off/away from work.

In contrast, all of my classmates and I agreed that if General Motors adopted this robot suit, they would do it to allow the company to have one worker lift the thing 200 times a day and fire the other worker (or move that worker to another line where they are lifting 200 times as well).


Back to AI... There's a Dell (I think??) commercial for the company's AI research arm, where a daughter gets ahold of a scrap of super old, weather-worn paper and, using company property and time, recovers grandma's or great-grandma's old soup recipe, which is then cooked and fed to her dad, making him the happiest clam in the sea.

We have professionals who set about studying and recreating old documents, trying to decipher what a word may have been when the surface has water damage and smudged-off letters or parts of words. An AI system could theoretically aid them in their already existing skills.

But as has been discussed in this thread already, these programs and "skills" shouldn't be used by undergrads doing coursework assignments, as they don't have the requisite skills to know whether the output is good or not. The balance then is: does a university, which needs researchers who are recovering and digitizing old documents and whatnot, do the European thing, allowing the same number of researchers to work better/more efficiently, or does it do the American thing and cut positions while expecting the same or more output?
   
Made in gb
Decrepit Dakkanaut
UK

I think the issue is that we can all see that there's a good side to AI, especially in a world with increasing amounts of information.

Also there is AI in things like image restoration - we've had it for ages; it just wasn't always marketed as "AI", because AI is the new buzzword every manager wants.

The issue people take right now is indeed when its being used to cheat education or to replace professions.


It's one of those situations where the right use can benefit humanity greatly and I think we will get there. The issue is along the way there are some big hurdles and it can go sideways.



Another thing that people aren't as hot on yet, but will be, is personal data. Even though AI isn't supposed to, it very much can do a LOT of very creepy, very advanced personal data harvesting and linkup. There's also the fact that it relies on being online the whole time, and I can honestly see that at some point there's going to be a generation or three that push back heavily towards an "offline life" and really do get hot about their digital privacy.

It's like the MS AI that screenshots your screen every few seconds. People want data backup, but they don't want all their personal interactions and such being recorded on some server in some country. Because data breaches happen, data abuse happens, and more.

   
Made in gb
Decrepit Dakkanaut

Haven't MS been saying those screenshots are stored on your machine, and never fed back to head office?

At which point my next question becomes "Why do you think you can fill my hard drive with these without asking?"

   
Made in gb
Calculating Commissar
England

 Dysartes wrote:
Haven't MS been saying those screenshots are stored on your machine, and never fed back to head office?

At which point my next question becomes "Why do you think you can fill my hard drive with these without asking?"

This was true of all the data on your hard drive. Then OneDrive became opt-in. Then OneDrive became opt-out without fanfare. I don't trust big tech with that kind of info.



   
Made in gb
Decrepit Dakkanaut
UK

 Haighus wrote:
 Dysartes wrote:
Haven't MS been saying those screenshots are stored on your machine, and never fed back to head office?

At which point my next question becomes "Why do you think you can fill my hard drive with these without asking?"

This was true of all the data on your hard drive. Then OneDrive became opt-in. Then OneDrive became opt-out without fanfare. I don't trust big tech with that kind of info.


Yeah, the default save option for all MS Office software is now OneDrive. You actually have to fiddle around to save locally.

The big dream of big business is to have all your data stored at their end, not your end.
At the innocent level, it lets them maintain and upkeep one software version across the world, and they can monitor basic use (features and so forth) and see what users are wanting, using and doing.
Of course there are many layers below that where, if they've got the info, they can deep-dive into it; or someone with nefarious intent can, even if that nefarious intent is just to work out more of who you are so they can live-feed more ads that work on you.


But yeah, this is why I think at some point there's either going to be a big offline push in tech or even an anti-tech push. Even with all the convenience tech brings, at some point people get worried about personal data.
Though I think we are a good way off yet; we've got generations growing up on phones, and phones have replaced computers for a lot of people's digital needs. I can easily see that whilst PC users might push back, phone users would be more than happy to have everything run "server-side" so their phone can be both super-powerful (because it's just streaming data) and affordable.

   
Made in gb
Leader of the Sept

In my work, the best use of LLM stuff isn't necessarily content creation, it's content retrieval. There is so much stuff that we create and recreate endlessly because the person needing to do a particular job isn't aware of the similar stuff that has been created by others in the past. Being able to serve up past solutions to the staff for critical assessment against new challenges is a great use.

The downside is that the endless recreation of solutions is quite handy in training people in the job.
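For what it's worth, even plain keyword similarity gets you some way towards that "serve up past solutions" idea. A rough sketch using TF-IDF rather than an LLM - the document snippets and the scikit-learn approach are illustrative assumptions on my part:

```python
# Rank previously written documents by textual similarity to a new task
# description, so staff can review what already exists before starting over.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

past_docs = [
    "2019 report: vibration testing procedure for bracket assemblies",
    "2021 memo: supplier qualification checklist for machined parts",
    "2017 analysis: thermal expansion margins for aluminium housings",
]

def find_similar(query, docs, top_n=2):
    """Return the top_n past documents most similar to the query text."""
    vectoriser = TfidfVectorizer(stop_words="english")
    doc_matrix = vectoriser.fit_transform(docs)
    query_vec = vectoriser.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    return sorted(zip(scores, docs), reverse=True)[:top_n]

for score, doc in find_similar("thermal margins for a new aluminium housing", past_docs):
    print(f"{score:.2f}  {doc}")
```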

   
Made in eu
Frenzied Berserker Terminator
Southampton, UK

Ah, that old chestnut... "If HP knew what HP knows, we'd be three times more productive."

All the little bits of knowledge that people collate and manage for their own convenience, but never get shared any further...
   
 