I think my GPU has gone pop (I'll be testing another card tomorrow to make sure, but I'm not getting any output from any of the ports, and I can't remote into the PC either which makes me think it's stalling during boot - though I guess it could be something else in the box).
Anyway - since I've just got into playing Cyberpunk 2077, that'll mean a new GPU.
I'm not a super heavy gamer so I don't want or need to spend loads - but I do want something reasonable. Gaming at 1080p, occasional Blender, occasional Photoshop and Lightroom.
Current set up is an i5 12400f, 32 gb of ram and an old GeForce 1070.
I was planning on sticking with Nvidia and going for a 4060 for around £270, which looks to be a decent fit. But then I saw the new Intel B580, which is around the same price and seems to be getting some pretty stellar reviews (and has 12 GB rather than 8, which seems like a useful uplift).
Has anyone tried the new Intel GPUs, or got any good reviews to read? Or anything else I should be looking at? AMD competitors seem to have more ram, but lower overall performance.
anything else I should be looking at?
A console?
AMD cards are decent as well. I've got an AMD 7600 XT in my 1080p machine in the study, which is a 12400F with 32GB of RAM like yours. The card has 16GB of VRAM, which seemed like a better choice for longevity than Nvidia's offerings at that price point.
I recall the RTX 4060 being a widely criticised card at the time of release. Not sure how it'd stack up a year later, mind you.
Do you have a max budget?
Over the years I've settled into Intel for CPU, and Nvidia for GPU.
Over the years I’ve settled into Intel for CPU, and Nvidia for GPU.
Both of which are maybe not the best choices of the current generations - Intel CPUs have had unusually high failure rates in the last few generations. And Nvidia, while their top-end cards are still the best, have become hugely expensive, especially for their mid- to lower-end offerings (particularly on release of a new generation; the mid-gen refresh tends to be what the card should have been to start with), where they're trading on the reputation of the top-end stuff.
In my experience VRAM plays a bigger role than is often mentioned in performance reviews. In some games I play, VRAM comes into play as you increase texture and detail settings. If you're playing at 1080p with global presets at low/medium then it shouldn't be an issue. But if you like more detailed textures and smoother edges, more VRAM might be needed.
What I'm waffling on about is that an AMD card for a similar budget with performance 5% below Nvidia might cope better with games with high VRAM demand. I think Cyberpunk falls into that category.
Depends on your priorities: max fps with less detail, or lower fps with more detail?
I would sit on it a few weeks, as the new Nvidia cards will be announced/released, which could potentially cause a slight pricing shake-up.
In my mind, Cyberpunk 2077 requires an Nvidia card.. hands down... I haven't used an AMD card this century; I would consider one for my lad's PC, however. And Intel isn't on my radar right now.
This is quite an in-depth review of the new Intel cards (I've not watched it all).
It covers some handy comparisons to other cards too.
It covers Cyberpunk (at 1440p) at about the 14m40s mark - it does surprisingly well, but bear in mind they are using a more powerful CPU, though it's also running at a higher resolution...
I'd watch the full video to get a full overview before dropping the cash though... it looks like a good contender, soundly whooping the Nvidia 4060's ass.
A console?
I have a PS4 which occasionally gets used. Not much cop for Blender or Lightroom, though.
Do you have a max budget?
Probs around £300.
If you'd asked me a day or so ago, I'd have said Nvidia, no questions. But as mentioned above, the 4060 wasn't exactly universally adored, and it sounds like 8 GB of RAM is going to be a problem in the next few years (my 1070 had 8 GB, and that's from 2016).
And there are some pretty compelling reviews for the Arc B580 at the moment, but it's obviously pretty new still.
I'm not writing off AMD either, the 7600 XT could definitely be an option.
Recently got a NVIDIA RTX 4070 (paired with AMD 7700 processor). Very happy so far. Doesn't get too hot which is important in the Ghost S1 case I have.
Note, I mostly play Command and Conquer so leading edge graphics is not needed, but I do like to use a sledgehammer to crack nuts 🙂
A console?
Like clockwork 😀
In my mind, Cyberpunk 2077 requires an Nvidia card
I'd definitely put AMD on the list. Even the older RX 6800 is still a very relevant GPU, has 16GB and will happily chug along at 100+ fps 1080p (RT off). I'm playing on a Radeon 7900XT and it's great. Close to 100fps at 1440p with ultra settings and RT on. Plus it was about 60% of RRP from Ebuyer's Ebay store during the Black Friday nonsense, so maybe hang fire until the Boxing Day sales to see if a similar deal comes up.
First hit on google comparing the 7600xt and 4060 in cyberpunk
So they're similar, fwiw I think the 7600xt will have greater longevity due to the extra vram - which was my deciding factor in getting it.
I'd be looking at the B580 at that price point. A 4060 is probably a safer option, but Intel's drivers have come on a lot and the B580 is better VFM than a 4060. Neither is really going to be doing ray tracing, so that's not a factor for me. At 4070 Ti level and above I'd go Nvidia for ray tracing and frame gen though (which works better when you're already at 60+ fps).
Yeah a 4070 ti would be lovely but I just can't justify the cost for the use it'll get.
I will be disappearing down a YouTube rabbit hole over the next few days...
I've just picked up cyberpunk - I was going to buy it anyways, but being able to test it on a system very similar to yours might be useful. I'll post up here once it's downloaded.
I wouldn't worry about RT at your budget... it's an expensive 'nice to have' but none of the budget/mid-range cards do it that well without a big performance hit.
There's a review of the B580 on TechSpot. Looks like a decent VFM card, but they've been selling fast so getting hold of one might be a challenge.
Quick update - Ran the benchmarks on my system as follows (cba fighting image posting)
Results: Average FPS: 82 - Min FPS: 66.99 - Max FPS: 103.86 - Time: 64.25 - Number of Frames: 5269
System Specs: RX 7600 XT with 16GB VRAM, 12400F CPU, 32GB system RAM
Custom Preset: Texture quality: High, Resolution 1920x1080, Windowed borderless mode, Vsync off, max fps off, frame generation off, DLSS off, Ray tracing off.
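(For what it's worth, the reported average checks out against the raw counters - average FPS is just total frames divided by total time:)

```python
# Sanity check: average FPS from the benchmark's raw counters
frames = 5269
seconds = 64.25
print(round(frames / seconds, 2))  # ~82, matching the reported average
```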
Had a play with the framegen/supersampling/ray tracing but it didn't look great, and RT doesn't bring that much to the graphics anyways, imo
I would and did buy a 1080 Ti rather than a 4060 as it's much cheaper for slightly better general performance at the cost of much higher power consumption.
Turns out it was my RAM at fault, not the GPU after all. Or actually, the motherboard - two RAM slots just appear to have died.
In my mind, Cyberpunk 2077 requires an Nvidia card... hands down...
That's nice.
Meanwhile in the real world, albeit out of the OP's budget, AMD 7900 XTs and GREs outperform the Nvidia 4070 Ti Super.
https://www.lttlabs.com/articles/gpu/amd-radeon-rx-7900-xt
I’m playing on a Radeon 7900XT and it’s great. Close to 100fps at 1440p with ultra settings and RT on. Plus it was about 60% of RRP from Ebuyer’s Ebay store during the Black Friday nonsense
What?!? You lucky bastard!
FPS per pound the 4060 is a cheap alternative, but the 7700 XT from AMD would be much better and only slightly more money.
Like clockwork
Sorry if it's tedious, I don't mean to be that "have a Windows problem, install Linux or buy a Mac" guy but I honestly think it just doesn't occur to some people how simple it is.
As the OP says it's no use if you need it for applications, but it's difficult to see how you can improve on a PS5 or a Series X|S for gaming today. One of the head cheeses at Microsoft said recently that current-gen console tech is as good as it's going to get (which I don't believe but hey). I can guarantee that any Xbox One / X|S game and many 360 and original Xbox titles I pick up will just work without any fannying about with settings or worrying that my GPU isn't up to snuff.
High-end PC gaming is a hell of a lot of money just to use WASD and a mouse, and as the mere existence of this thread demonstrates it's a minefield to spec / build / keep current. I have a mate who's a long-term PC gamer and he gets all excited when he gets another 6fps out of his graphics card by switching his liquid cooling from water to unicorn tears or something and I just don't get it. Playing a high-speed game like driving an S-class car in Forza on a 75" 4k TV where frame rate might actually be relevant to anything, I'm not admiring the scenery, I'm working out when I can next afford to blink.
Yours, a former PC gamer with no regrets about jumping ship. Doubly so now with the rise of GPU-assisted crypto mining ramming the price of GFX cards sky high.
I've got an Intel A770 which was cheap enough in the summer (under 220 I think) to give a go as I was building a new Intel-based ITX PC. I'm really not that into the benchmarks etc or the latest AAA stuff but some videos I watched seemed to reckon that it had much improved with driver updates over the previous year or so and was now very decent for the money, if still a bit underperforming on some games. 4060 or 7600XT rivalling or bettering but at a lower price.
I've been very happy with it, most demanding thing I'm playing is still Forza Horizon 4, and it manages 3440x1440 high at 120+ fps.
at 120+ fps.
My point entirely. Why would you know, let alone care?
This is my aforementioned mate Rob. He has software running which monitors/logs fps and he's done this since, ye gods, the days of Voodoo 3dFX cards at least. He has spreadsheets for gods' sake. The human eye craps out somewhere below 50Hz (which is why mains AC frequency is what it is, otherwise you'd see incandescent bulbs flickering). Beyond that we're into realms of Hi-Fi grade woo. I defy anyone to tell the difference between (say) 60fps and 120fps in a blind test.
I don't sit here playing Forza thinking "well, it's alright, but it's only 119fps so it could be improved, best go spend another £600 on a nominally better graphics card." Unless you're a Peregrine Falcon it's utter nonsense.
Play the damn game and stop fretting about numbers. If you can't tell then it doesn't matter.
at 120+ fps.
My point entirely. Why would you know, let alone care?
I can really tell if my fps drops much below about 60, even for just a few milliseconds or whatever... It just feels janky and stuttery.
For example on forza horizon 5 my fps fluctuates between 70 and about 120 on average which is really nice and smooth.
In particularly heavy scenes... Mostly player made tracks with billions of assets and lights etc it can spike down to about 50fps and you can really feel it stutter.
It's not really an issue in slower-paced games as there is not much going on visually to make any dips in frame rate stand out.
EDIT: I will say that I'm totally fine if frame rates never spike below about 60, even for a second. That's fine for me and allows me to dial up the graphics settings to maintain a consistent 60+ fps.
If, for example, a game is running at well over 100fps consistently, I'll ramp up the graphics quality until it hits the frame rate too badly, i.e. it's running at circa 70fps consistently.
With frame rate it's the momentary low spikes in 'busy' moments that you really feel.
Extreme example: I'm dual-booting with Linux and I can't get Forza to run at more than about 40fps, and it's basically unplayable.
It looks like some sort of terrible stop-motion animation from the '80s.
Exactly - on benchmarks I put 1% low performance way above averages.
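FWIW, the "1% low" figure reviewers quote is usually derived from per-frame times rather than instantaneous FPS: take the slowest 1% of frames, average their frame times, and invert. A rough sketch (the helper name is mine, and exact definitions vary between benchmarking tools - some use the 99th-percentile frame time instead):

```python
def one_percent_low_fps(frame_times_ms):
    """FPS averaged over the slowest 1% of frames (one common definition)."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)        # slowest 1%, at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth 10 ms frames plus a single 25 ms stutter:
times = [10.0] * 99 + [25.0]
print(round(one_percent_low_fps(times)))  # 40 - far below the ~98 fps average
```

Which is exactly why a card can post a lovely average yet still feel stuttery: the average barely moves, but the 1% low collapses.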
And if there was no discernible difference between 60 and 120hz, no-one would bother making a display that supports more. What's true of watching a video isn't necessarily true when you're interacting with what's on screen.
Go Nvidia - the ray tracing support etc. comes to that platform first, and the AI frame generation is voodoo and will make the card "last" longer. Can you get a second-hand 4070 at that price point?
By the way, this is from the standpoint of someone who has been Nvidia for the last 3 PCs.
Up until last year I was very much in the camp of PC for life. Started with the same 3dfx card Cougar mentioned and went through a batshit crazy overclocking phase too.
But I started looking at the reality of things more recently, especially with the GPU price rises. I play most of the games on a console with my mates - sure I could do cross gen but honestly the console just works.
Cyberpunk is one such game; it looks pretty nice on PS5. Sure, the PC can go nicer, but you sit closer to the typical PC screen so you need a bit more detail. 1080p on my oldish HDR TV still looks surprisingly nice.
And now I've just picked up a macbook pro for Lightroom & Resolve duties which also just works...
I don't think I ever remember a time when my windows machines didn't need some sort of fettling or other to work nicely. Sure, being an enthusiast I did strive for perfection but it was tiring.
I've also had ram sockets die on more than one mainboard. OK, I've had a lot of mainboards but this has been the only fault I have ever seen.
Go Nvidia - the ray tracing support etc. comes to that platform first, and the AI frame generation is voodoo and will make the card “last” longer. Can you get a second-hand 4070 at that price point?
By the way, this is from the standpoint of someone who has been Nvidia for the last 3 PCs.
That just seems like fanboyism to me... RT 'costs' too much in terms of frame rate unless you go high end. The very mediocre RTX 4060, with its woeful amount of onboard RAM, is a total 'nothing sandwich' of a card at its current price point compared to the competition from AMD/Radeon, and now Intel too, it seems.
You are far better off disabling RT and ramping up other graphics settings for a better overall visual experience without sacrificing a chunk of FPS, unless you're running a 4080 or a 4090. And who in their right mind would spend that much on a GPU? Clearly not me! But some people do and that's fine; your money, your choice.
And if there was no discernible difference between 60 and 120hz, no-one would bother making a display that supports more.
Nonsense. The only reason no-one would make them is if no-one could market them so that people would buy them. Sony were kicking out 100Hz Trinitron CRTs back in the days when your high-end source was a PS2.
I can really tell if my fps drops much below about 60, even for just a few milliseconds or whatever… It just feels janky and stuttery.
Sure. When it drops below the point where it's lagging then you're going to notice of course. If it's above the threshold where it's smooth then more isn't more, it's just bigger numbers and e-peen waving.
Sure. When it drops below the point where it’s lagging then you’re going to notice of course. If it’s above the threshold where it’s smooth then more isn’t more, it’s just bigger numbers and e-peen waving.
Agreed... If my 1% lows never dip below say 60 fps I don't really care..
But it's jarringly obvious in fast paced games when it does, so then you have two options.. Lower your graphics settings to bring the frame rate higher, or upgrade the pc.
As I said at the head of this thread, there is another option. 🙂
I saw someone on a Reddit thread talking about needing 350 FPS. Thought I must have misread, what screen would do that?
Anyway yeah, if it was 100% gaming then tbh I'm with Cougar, I'd probs go for a PS5 or an Xbox. But it needs to support my occasional tinkering in Blender and (to a lesser extent) photo editing, so I stick with the PC.
Plus, I have a shed load of games on Steam (plus literally hundreds of Epic freebies).
But last time I played (with unoptimised RAM) I managed a good few hours with no crashes. So I reckon we're good.
A fair point, but... there's always a but. New consoles are also a bit of a con: spend 600 quid on a console, then buy a game you like for say 50 quid...
And then you have to pay a monthly subscription on top of that to actually play the game online?
That's where the value-for-money argument gets a bit blurred... Yes, a decent PC will cost you a grand plus, but then you have a lot more options. Not to mention being able to do computer stuff. A console is a one-trick pony.
That's not to say I'm against consoles... Quite the opposite... My mate asked me to spec him up a self build 'gaming ' pc the other year.. And I straight up said buy a PS5 or a top spec xbox, and a windows laptop.
The beauty of a pc is you have a lot of options depending on what you want.
I mean, I'm currently running an undervolted i5 and an RX 6800 XT, and it comfortably pisses all over new-gen consoles at 1440p... And that's a PC that is a generation or two old.
And then there's the fan noise on new gen consoles...I don't know about anyone else, but I prefer to not listen to what sounds like a helicopter trying to take off in my bedroom/living room.
Why do they try to scupper the cooling by putting consoles in such small and restricted cases?
I saw someone on a Reddit thread talking about needing 350 FPS.
People say a lot of things on reddit, lol!
I actually think a lot of stuff on reddit is bot-driven these days... it's pretty obvious when you look at the language being used.
Yeah this is probably true. There's some good stuff on Reddit, there are some nutters too.
New consoles are also a bit of a con.. Spend 600 quid on a console.. Then buy a game you like for say 50 quid…
And then you have to pay a monthly subscription on top of that to actually play the game online?
I can't comment on the PS model. But Game Pass Ultimate on Xbox gives you more games than you can possibly play for like a quid a week. AAA titles routinely drop on Game Pass, often platform exclusives, it would have to be something exceptional for me to pay actual money for. I'm currently playing the new Indiana Jones game, it cost me nowt. If you were to buy one game a year at that price point, you've paid for the sub.
Not to mention being able to do computer stuff... A console is a one-trick pony.
Yeah, but what a trick. Who owns a console and doesn't already own some form of PC for "email, web browsing and occasional Office use" which is the subject of a "recommend me a laptop" thread here on like a weekly basis.
And then there’s the fan noise on new gen consoles…I don’t know about anyone else, but I prefer to not listen to what sounds like a helicopter trying to take off in my bedroom/living room.
It's broken? I have a Series X and you can't tell it's on aside from the power LED. My PC is noisier. Maybe Sony's offering is different, idk.
I've had a series x and now a ps5. The series X is a better designed piece of hardware and probably marginally quieter, but the ps5 isn't noisy.
Sorry if it’s tedious...
...As the OP says it’s no use if you need it for applications...
OP mentions the need for e.g. Blender, Lightroom - and you recognise the same - but you still recommend a console. And you do it enough when people ask specifically about PC hardware that I've noticed a pattern. I'm not having a pop at all, just pointing out the amusing regularity.
Game Pass Ultimate on Xbox gives you more games than you can possibly play for like a quid a week.
Unless they get de-listed. Then you're shafted if it's your favourite game, and you lose your saves, unless you just buy the game in the first place.
Problem with a lot of the mid-range Nvidia 40xx series is the memory bandwidth; they crippled the cards with a 128-bit bus unless you go 4070 or above. On the AMD side there are leaks starting to happen on the 9xxx series, so likely a launch there and some bargains to be had on the 7xxx range for a little while.
Intel seem to have a decent winner on their hands with their latest release, when you consider that over time performance is likely to increase as they optimise the drivers.
Toms Hardware GPU hierarchy is a good rough guide on where various GPU's are against each other https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
OP mentions the need for e.g. Blender, Lightroom – and you recognise the same – but you still recommend a console. And you do it enough when people ask specifically about PC hardware that I’ve noticed a pattern. I’m not having a pop at all, just pointing out the amusing regularity.
I genuinely missed that original requirement.
Unless they get de-listed. Then you’re shafted if it’s your favourite game, and you lose your saves, unless you just buy the game in the first place.
Games do sometimes get retired yes, but it's not commonplace and you can always just buy them for very little money at that point.
I don't know where you've got the notion from that you lose your saved games but that's incorrect, I've got saves going back to Xbox 360 days.
Whatever, man, if you are happy to pay a subscription, then crack on.
A fair point, but. . There’s always a but.. New consoles are also a bit of a con.. Spend 600 quid on a console.. Then buy a game you like for say 50 quid…
And then you have to pay a monthly subscription on top of that to actually play the game online?
Unless they get de-listed. Then you’re shafted if it’s your favourite game, and you lose your saves, unless you just buy the game in the first place.
And then at some point they turn off the service. Like Wii, various DS stores, Wii U, Xbox 360, various PS Stores. What then? Stick in your Day 0 disc and hope like hell it'll boot or even recognise the saves (if it even has any assets and isn't a glorified token for the download)? What then?
As said, it's a false economy. I have a One S and Switch that will likely be e-waste in a few years. Yes, I made a rod for my own back with Ultrawide, I accept that. I don't like it but here we are. If I was playing at 1080 a 4060, 7600 or Arc would be perfect. It's only above 1080 that you need more grunt as games become GPU bound.
PCs for gaming are genuinely more affordable; my last rig was a QX6850 with 8GB of RAM and a Radeon 7850, which is still going strong. I was playing all the games I play now, but at lower graphics settings. You don't need the latest and greatest. Hell, I refused to upgrade my One S as my TV doesn't have HDR, let alone 4K.
Yep, my comment above - it was always Nvidia for me. My last experience of AMD was during the crypto mining phase, and I found it comedic how fussy AMD cards were just to run properly... so they have sat on the back burner of my mind. And then Nvidia took on RT and frame generation, so that's where I shop. That being said, I go high end and expect to replace with another high-end card 4 years on. I go for visuals rather than fps as I'm not competitive; as long as it's smooth performance with no stutters etc. and I stick to variable refresh, I'm happy. Also, I don't use frame generation, as the last few titles I used it on were giving me horrible shimmers around my character...
So yeah, I want RT, I notice it, I just want to set max visuals and go with it, and as long as I manage 60+ fps that's lovely.
Re the games etc.: I've been a Humble Bundle subscriber since day 1, so 8? games a month, and there is always one in there I really want to play. I generally only buy one full-price ~35 quid game a year or less. And if I complete a game that's it, never to be played again, so I expect every purchase to last about 15-50 hours of playtime; a £50 console game doesn't interest me so much.
Re PC stability, past 2 PC's I've built have been rock solid stable, no fettling or maintaining.. they just work
You shouldn't hang your hat on one manufacturer. That's how the cycle of companies phoning it in perpetuates; once people start looking elsewhere they correct themselves.
Nvidia don't give a single solitary **** about consumers, we're not their market, they're too busy servicing the AI industry. So what they do sell is expensive because they know people will pay what a whole system used to cost. Similarly AMD aren't innovating and have no flagship 8000 series card. Meanwhile Team Blue jump into the mid tier with an affordable range and are hopefully going to give the market the kick up the arse it needs.
Conversely, Intel are throwing away their CPU side and AMD are starting to stagnate, whilst ARM and RISC-V manufacturers are waiting in the wings. It's all cycles...
I do agree with the stability argument, overclocking is a click of a button and is no harder than hitting the turbo button on Ye Olde 486DX.
Whatever, man, if you are happy to pay a subscription, then crack on.
Thanks!
I've just paid £105 for two years. If in that time I was instead to buy three full-price games, I'd be worse off.
And then at some point they turn off the service.
Maybe so. At which point if I really want to play a game that old then it'll probably be in CEX or Game for 50p. The Xbox 360 is almost 20 years old, at this juncture it owes me nothing. If 360 online is stone dead - I haven't tested it to see what does and doesn't work still - about the only thing I'll miss is playing late night Liars Dice with mates in Red Dead Redemption.
What's this obsession with saves all of a sudden?
If the OP is looking for a new GPU, the Boxing Day sales from CCL have started. For a little more than the budget a 4070 could be had, and that would last years.
https://www.cclonline.com/all-deals/winter-sales/filter/graphics-cards/
I've decided to stick with what I have for now.
Those 4070s do look a bargain though. They're more than that second-hand on CEX!
What’s this obsession with saves all of a sudden?
You mean being able to carry them with you from one device to another without needing to pay for an online service to store them? Some people like that. PCs still give you full control of your library and let you buy stuff DRM free without having to rely on a third party serving the installer ad infinitum.
Obviously PC gaming isn't for you and that's fine. Others see it as a better value proposition, experience or whatever and that's equally fine. Tell you what, why don't we all pile into a console thread and tell everyone they should have a Steam Deck instead? Or a board game thread and tell everyone they should be playing deck games? Or would that just be crapping over threads we're not otherwise interested in?
Look.
It was a sincere suggestion, I misread the original request as I've already explained (and the OP agreed with me). Quite why you feel the need for a pile-on I don't know. I've nothing against PC gaming, I've got over 300 titles in my Steam library alone.
And for what it's worth you don't need a paid subscription for cloud saves, it just works (on Xbox at least, dunno about the PS). I've moved local saves around between consoles / PCs plenty of times, it's a pain in the ass. Times have changed since the Playstation 1.
You do for Nintendo. I actually had a whole reply written out that didn't post for some reason, and I missed that bit when I wrote a condensed version commensurate with my later level of inebriation.
Local saves are easy on PC, less so on Xbox but still not that bad. The problem is it only needs one greedy shite to demonstrate what bullshit consumers will put up with and before you know it everyone is on the race to the bottom.
And I was only replying to what you wrote. It's not a pile-on so much as you trying to convince us all consoles are better. Working out about as well as someone wading into a PC/Mac thread and declaring MacO$/Windoze is only for the gullible, tbh. Lightroom or not, it's still a viable option. I wish I'd bought a Steam Deck over a Switch; at least I'd be able to play with a better screen and have control over my saves.
I never said it was better. I was offering an alternative and discussing my experiences. You know, on a discussion forum. :shrug:
I've no idea about current-gen Nintendo; the last Nintendo console I had was a Gamecube. In my experience cloud saves aren't just a non-issue, they're a benefit. I'm round at my partner's lad's right now, jumped on his PS5 and picked up my PS3 account from several years ago. If Nintendo requires a paid subscription to access cloud saves (does it?) then it's an outlier, nothing else does. PC included; I think the last time I had to manipulate local saves was Torchlight, and that was 15 years ago.
You were coming across pretty preachy with the fps talk and ancient history about non-stability (I've not had graphics driver issues since, er, ever. That may be down to the games I play, who knows, but even when my kit has been current it's never been an issue).
But yeah Nintendo are a definite outlier and definite arsehole. Gamecube is my other weapon of choice, until they make a new F Zero I have no other choice.
Are you sure you haven't confused me with another poster? I didn't say anything about stability or drivers. That's usually the domain of people who still think "Windoze" is hilarious despite not having touched it since Windows 95.
Lol, I did.
This is why I get annoyed at a lack of proper quote/reply function!
Apologies, as you were.