<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>VideoGamePerfection.com | dead_screem | Activity</title>
	<link>https://videogameperfection.com/members/dead_screem/activity/</link>
	<atom:link href="https://videogameperfection.com/members/dead_screem/activity/feed/" rel="self" type="application/rss+xml" />
	<description>Activity feed for dead_screem.</description>
	<lastBuildDate>Tue, 21 Apr 2026 10:54:38 +0100</lastBuildDate>
	<generator>https://buddypress.org/?v=</generator>
	<language>en-US</language>
	<ttl>30</ttl>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>2</sy:updateFrequency>
	
						<item>
				<guid isPermaLink="false">23515fd959ab3e4fb1a4fd7b4a22e067</guid>
				<title>dead_screem replied to the topic OSSC Pro Input profile detection issue in the forum OSSC - Discussion and support</title>
				<link>https://videogameperfection.com/forums/topic/ossc-pro-input-profile-detection-issue/#post-60046</link>
				<pubDate>Sat, 27 Jan 2024 11:50:22 +0000</pubDate>

									<content:encoded><![CDATA[<blockquote><p>For scaler mode it would be possible to add an option where smallest / highest generic sampling rate preset is always used, leading to selected scaling algo having a bigger impact in horizontal up/downscaling.</p></blockquote>
<p>I don&#8217;t understand what you mean.</p>
<p>It just makes no sense to do it this way. If I adjust everything for 1080p, I then have to redo it all&hellip;<span class="activity-read-more" id="activity-read-more-7135"><a href="https://videogameperfection.com/forums/topic/ossc-pro-input-profile-detection-issue/#post-60046" rel="nofollow ugc">Read more</a></span></p>
]]></content:encoded>
				
				
							</item>
					<item>
				<guid isPermaLink="false">ee4388780fa7f68bc396dedf4837e952</guid>
				<title>dead_screem started the topic OSSC Pro Input profile detection issue in the forum OSSC - Discussion and support</title>
				<link>https://videogameperfection.com/forums/topic/ossc-pro-input-profile-detection-issue/</link>
				<pubDate>Sat, 27 Jan 2024 03:10:54 +0000</pubDate>

									<content:encoded><![CDATA[<p>Input resolution for NTSC SNES gets misdetected for Scaler as &#8220;Gen 1920&#215;240&#8221; if I&#8217;m set to 1080p output and have FULL aspect set; if I have it set to 4:3, the input gets detected as &#8220;Gen 1280&#215;240&#8221;. If I set 480p output, input detection is fine. It&#8217;s as if the output resolution is affecting input resolution detection. This also happens for NTSC&hellip;<span class="activity-read-more" id="activity-read-more-7130"><a href="https://videogameperfection.com/forums/topic/ossc-pro-input-profile-detection-issue/" rel="nofollow ugc">Read more</a></span></p>
]]></content:encoded>
				
				
							</item>
		
	</channel>
</rss>